Feb 16 12:11:58 localhost kernel: Linux version 5.14.0-677.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Feb 6 13:57:07 UTC 2026
Feb 16 12:11:58 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 16 12:11:58 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:58 localhost kernel: BIOS-provided physical RAM map:
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 16 12:11:58 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 16 12:11:58 localhost kernel: NX (Execute Disable) protection: active
Feb 16 12:11:58 localhost kernel: APIC: Static calls initialized
Feb 16 12:11:58 localhost kernel: SMBIOS 2.8 present.
Feb 16 12:11:58 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 16 12:11:58 localhost kernel: Hypervisor detected: KVM
Feb 16 12:11:58 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 16 12:11:58 localhost kernel: kvm-clock: using sched offset of 9973969488 cycles
Feb 16 12:11:58 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 16 12:11:58 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 16 12:11:58 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 16 12:11:58 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 16 12:11:58 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 16 12:11:58 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 16 12:11:58 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 16 12:11:58 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 16 12:11:58 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 16 12:11:58 localhost kernel: Using GB pages for direct mapping
Feb 16 12:11:58 localhost kernel: RAMDISK: [mem 0x1b6e4000-0x29b69fff]
Feb 16 12:11:58 localhost kernel: ACPI: Early table checksum verification disabled
Feb 16 12:11:58 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 16 12:11:58 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:58 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:58 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:58 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 16 12:11:58 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:58 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:58 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 16 12:11:58 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 16 12:11:58 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 16 12:11:58 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 16 12:11:58 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 16 12:11:58 localhost kernel: No NUMA configuration found
Feb 16 12:11:58 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 16 12:11:58 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 16 12:11:58 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 16 12:11:58 localhost kernel: Zone ranges:
Feb 16 12:11:58 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 16 12:11:58 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 16 12:11:58 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 12:11:58 localhost kernel:   Device   empty
Feb 16 12:11:58 localhost kernel: Movable zone start for each node
Feb 16 12:11:58 localhost kernel: Early memory node ranges
Feb 16 12:11:58 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 16 12:11:58 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 16 12:11:58 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 12:11:58 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 16 12:11:58 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 16 12:11:58 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 16 12:11:58 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 16 12:11:58 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 16 12:11:58 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 16 12:11:58 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 16 12:11:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 16 12:11:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 16 12:11:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 16 12:11:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 16 12:11:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 16 12:11:58 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 16 12:11:58 localhost kernel: TSC deadline timer available
Feb 16 12:11:58 localhost kernel: CPU topo: Max. logical packages:   8
Feb 16 12:11:58 localhost kernel: CPU topo: Max. logical dies:       8
Feb 16 12:11:58 localhost kernel: CPU topo: Max. dies per package:   1
Feb 16 12:11:58 localhost kernel: CPU topo: Max. threads per core:   1
Feb 16 12:11:58 localhost kernel: CPU topo: Num. cores per package:     1
Feb 16 12:11:58 localhost kernel: CPU topo: Num. threads per package:   1
Feb 16 12:11:58 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 16 12:11:58 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 16 12:11:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 16 12:11:58 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 16 12:11:58 localhost kernel: Booting paravirtualized kernel on KVM
Feb 16 12:11:58 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 16 12:11:58 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 16 12:11:58 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 16 12:11:58 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 16 12:11:58 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 16 12:11:58 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 16 12:11:58 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:58 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64", will be passed to user space.
Feb 16 12:11:58 localhost kernel: random: crng init done
Feb 16 12:11:58 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 16 12:11:58 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 16 12:11:58 localhost kernel: Fallback order for Node 0: 0 
Feb 16 12:11:58 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 16 12:11:58 localhost kernel: Policy zone: Normal
Feb 16 12:11:58 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 16 12:11:58 localhost kernel: software IO TLB: area num 8.
Feb 16 12:11:58 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 16 12:11:58 localhost kernel: ftrace: allocating 49543 entries in 194 pages
Feb 16 12:11:58 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 16 12:11:58 localhost kernel: Dynamic Preempt: voluntary
Feb 16 12:11:58 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 16 12:11:58 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 16 12:11:58 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 16 12:11:58 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 16 12:11:58 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 16 12:11:58 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 16 12:11:58 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 16 12:11:58 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 16 12:11:58 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:58 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:58 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:58 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 16 12:11:58 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 16 12:11:58 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 16 12:11:58 localhost kernel: Console: colour VGA+ 80x25
Feb 16 12:11:58 localhost kernel: printk: console [ttyS0] enabled
Feb 16 12:11:58 localhost kernel: ACPI: Core revision 20230331
Feb 16 12:11:58 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 16 12:11:58 localhost kernel: x2apic enabled
Feb 16 12:11:58 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 16 12:11:58 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 16 12:11:58 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 16 12:11:58 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 16 12:11:58 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 16 12:11:58 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 16 12:11:58 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 16 12:11:58 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 16 12:11:58 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 16 12:11:58 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 16 12:11:58 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 16 12:11:58 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 16 12:11:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 16 12:11:58 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 16 12:11:58 localhost kernel: active return thunk: retbleed_return_thunk
Feb 16 12:11:58 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 16 12:11:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 16 12:11:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 16 12:11:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 16 12:11:58 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 16 12:11:58 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 16 12:11:58 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 16 12:11:58 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 16 12:11:58 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 16 12:11:58 localhost kernel: landlock: Up and running.
Feb 16 12:11:58 localhost kernel: Yama: becoming mindful.
Feb 16 12:11:58 localhost kernel: SELinux:  Initializing.
Feb 16 12:11:58 localhost kernel: LSM support for eBPF active
Feb 16 12:11:58 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 12:11:58 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 12:11:58 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 16 12:11:58 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 16 12:11:58 localhost kernel: ... version:                0
Feb 16 12:11:58 localhost kernel: ... bit width:              48
Feb 16 12:11:58 localhost kernel: ... generic registers:      6
Feb 16 12:11:58 localhost kernel: ... value mask:             0000ffffffffffff
Feb 16 12:11:58 localhost kernel: ... max period:             00007fffffffffff
Feb 16 12:11:58 localhost kernel: ... fixed-purpose events:   0
Feb 16 12:11:58 localhost kernel: ... event mask:             000000000000003f
Feb 16 12:11:58 localhost kernel: signal: max sigframe size: 1776
Feb 16 12:11:58 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 16 12:11:58 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 16 12:11:58 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 16 12:11:58 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 16 12:11:58 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 16 12:11:58 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 16 12:11:58 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 16 12:11:58 localhost kernel: node 0 deferred pages initialised in 14ms
Feb 16 12:11:58 localhost kernel: Memory: 7617768K/8388068K available (16384K kernel code, 5795K rwdata, 13944K rodata, 4204K init, 7180K bss, 764408K reserved, 0K cma-reserved)
Feb 16 12:11:58 localhost kernel: devtmpfs: initialized
Feb 16 12:11:58 localhost kernel: x86/mm: Memory block size: 128MB
Feb 16 12:11:58 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 16 12:11:58 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 16 12:11:58 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 16 12:11:58 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 16 12:11:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 16 12:11:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 16 12:11:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 16 12:11:58 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 16 12:11:58 localhost kernel: audit: type=2000 audit(1771243917.231:1): state=initialized audit_enabled=0 res=1
Feb 16 12:11:58 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 16 12:11:58 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 16 12:11:58 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 16 12:11:58 localhost kernel: cpuidle: using governor menu
Feb 16 12:11:58 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 16 12:11:58 localhost kernel: PCI: Using configuration type 1 for base access
Feb 16 12:11:58 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 16 12:11:58 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 16 12:11:58 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 16 12:11:58 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 16 12:11:58 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 16 12:11:58 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 16 12:11:58 localhost kernel: Demotion targets for Node 0: null
Feb 16 12:11:58 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 16 12:11:58 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 16 12:11:58 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 16 12:11:58 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 16 12:11:58 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 16 12:11:58 localhost kernel: ACPI: Interpreter enabled
Feb 16 12:11:58 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 16 12:11:58 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 16 12:11:58 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 16 12:11:58 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 16 12:11:58 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 16 12:11:58 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 16 12:11:58 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [3] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [4] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [5] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [6] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [7] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [8] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [9] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [10] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [11] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [12] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [13] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [14] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [15] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [16] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [17] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [18] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [19] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [20] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [21] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [22] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [23] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [24] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [25] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [26] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [27] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [28] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [29] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [30] registered
Feb 16 12:11:58 localhost kernel: acpiphp: Slot [31] registered
Feb 16 12:11:58 localhost kernel: PCI host bridge to bus 0000:00
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 16 12:11:58 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 16 12:11:58 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 16 12:11:58 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 12:11:58 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 16 12:11:58 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 16 12:11:58 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 16 12:11:58 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 16 12:11:58 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 16 12:11:58 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 16 12:11:58 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 16 12:11:58 localhost kernel: iommu: Default domain type: Translated
Feb 16 12:11:58 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 16 12:11:58 localhost kernel: SCSI subsystem initialized
Feb 16 12:11:58 localhost kernel: ACPI: bus type USB registered
Feb 16 12:11:58 localhost kernel: usbcore: registered new interface driver usbfs
Feb 16 12:11:58 localhost kernel: usbcore: registered new interface driver hub
Feb 16 12:11:58 localhost kernel: usbcore: registered new device driver usb
Feb 16 12:11:58 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 16 12:11:58 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 16 12:11:58 localhost kernel: PTP clock support registered
Feb 16 12:11:58 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 16 12:11:58 localhost kernel: NetLabel: Initializing
Feb 16 12:11:58 localhost kernel: NetLabel:  domain hash size = 128
Feb 16 12:11:58 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 16 12:11:58 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 16 12:11:58 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 16 12:11:58 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 16 12:11:58 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 16 12:11:58 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 16 12:11:58 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 16 12:11:58 localhost kernel: vgaarb: loaded
Feb 16 12:11:58 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 16 12:11:58 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 16 12:11:58 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 16 12:11:58 localhost kernel: pnp: PnP ACPI init
Feb 16 12:11:58 localhost kernel: pnp 00:03: [dma 2]
Feb 16 12:11:58 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 16 12:11:58 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 16 12:11:58 localhost kernel: NET: Registered PF_INET protocol family
Feb 16 12:11:58 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 16 12:11:58 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 16 12:11:58 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 16 12:11:58 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 16 12:11:58 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 16 12:11:58 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 16 12:11:58 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 16 12:11:58 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 12:11:58 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 12:11:58 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 16 12:11:58 localhost kernel: NET: Registered PF_XDP protocol family
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 16 12:11:58 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 16 12:11:58 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 16 12:11:58 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 16 12:11:58 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 22615 usecs
Feb 16 12:11:58 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 16 12:11:58 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 16 12:11:58 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 16 12:11:58 localhost kernel: ACPI: bus type thunderbolt registered
Feb 16 12:11:58 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 16 12:11:58 localhost kernel: Initialise system trusted keyrings
Feb 16 12:11:58 localhost kernel: Key type blacklist registered
Feb 16 12:11:58 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 16 12:11:58 localhost kernel: zbud: loaded
Feb 16 12:11:58 localhost kernel: integrity: Platform Keyring initialized
Feb 16 12:11:58 localhost kernel: integrity: Machine keyring initialized
Feb 16 12:11:58 localhost kernel: Freeing initrd memory: 234008K
Feb 16 12:11:58 localhost kernel: NET: Registered PF_ALG protocol family
Feb 16 12:11:58 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 16 12:11:58 localhost kernel: Key type asymmetric registered
Feb 16 12:11:58 localhost kernel: Asymmetric key parser 'x509' registered
Feb 16 12:11:58 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 16 12:11:58 localhost kernel: io scheduler mq-deadline registered
Feb 16 12:11:58 localhost kernel: io scheduler kyber registered
Feb 16 12:11:58 localhost kernel: io scheduler bfq registered
Feb 16 12:11:58 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 16 12:11:58 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 16 12:11:58 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 16 12:11:58 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 16 12:11:58 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 16 12:11:58 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 16 12:11:58 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 16 12:11:58 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 16 12:11:58 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 16 12:11:58 localhost kernel: Non-volatile memory driver v1.3
Feb 16 12:11:58 localhost kernel: rdac: device handler registered
Feb 16 12:11:58 localhost kernel: hp_sw: device handler registered
Feb 16 12:11:58 localhost kernel: emc: device handler registered
Feb 16 12:11:58 localhost kernel: alua: device handler registered
Feb 16 12:11:58 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 16 12:11:58 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 16 12:11:58 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 16 12:11:58 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 16 12:11:58 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 16 12:11:58 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 16 12:11:58 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 16 12:11:58 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-677.el9.x86_64 uhci_hcd
Feb 16 12:11:58 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 16 12:11:58 localhost kernel: hub 1-0:1.0: USB hub found
Feb 16 12:11:58 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 16 12:11:58 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 16 12:11:58 localhost kernel: usbserial: USB Serial support registered for generic
Feb 16 12:11:58 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 16 12:11:58 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 16 12:11:58 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 16 12:11:58 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 16 12:11:58 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 16 12:11:58 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 16 12:11:58 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 16 12:11:58 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-16T12:11:57 UTC (1771243917)
Feb 16 12:11:58 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 16 12:11:58 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 16 12:11:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 16 12:11:58 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 16 12:11:58 localhost kernel: usbcore: registered new interface driver usbhid
Feb 16 12:11:58 localhost kernel: usbhid: USB HID core driver
Feb 16 12:11:58 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 16 12:11:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 16 12:11:58 localhost kernel: Initializing XFRM netlink socket
Feb 16 12:11:58 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 16 12:11:58 localhost kernel: Segment Routing with IPv6
Feb 16 12:11:58 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 16 12:11:58 localhost kernel: mpls_gso: MPLS GSO support
Feb 16 12:11:58 localhost kernel: IPI shorthand broadcast: enabled
Feb 16 12:11:58 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 16 12:11:58 localhost kernel: AES CTR mode by8 optimization enabled
Feb 16 12:11:58 localhost kernel: sched_clock: Marking stable (985001909, 144070840)->(1229900509, -100827760)
Feb 16 12:11:58 localhost kernel: registered taskstats version 1
Feb 16 12:11:58 localhost kernel: Loading compiled-in X.509 certificates
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 16 12:11:58 localhost kernel: Demotion targets for Node 0: null
Feb 16 12:11:58 localhost kernel: page_owner is disabled
Feb 16 12:11:58 localhost kernel: Key type .fscrypt registered
Feb 16 12:11:58 localhost kernel: Key type fscrypt-provisioning registered
Feb 16 12:11:58 localhost kernel: Key type big_key registered
Feb 16 12:11:58 localhost kernel: Key type encrypted registered
Feb 16 12:11:58 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 16 12:11:58 localhost kernel: Loading compiled-in module X.509 certificates
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 12:11:58 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 16 12:11:58 localhost kernel: ima: No architecture policies found
Feb 16 12:11:58 localhost kernel: evm: Initialising EVM extended attributes:
Feb 16 12:11:58 localhost kernel: evm: security.selinux
Feb 16 12:11:58 localhost kernel: evm: security.SMACK64 (disabled)
Feb 16 12:11:58 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 16 12:11:58 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 16 12:11:58 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 16 12:11:58 localhost kernel: evm: security.apparmor (disabled)
Feb 16 12:11:58 localhost kernel: evm: security.ima
Feb 16 12:11:58 localhost kernel: evm: security.capability
Feb 16 12:11:58 localhost kernel: evm: HMAC attrs: 0x1
Feb 16 12:11:58 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 16 12:11:58 localhost kernel: Running certificate verification RSA selftest
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 16 12:11:58 localhost kernel: Running certificate verification ECDSA selftest
Feb 16 12:11:58 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 16 12:11:58 localhost kernel: clk: Disabling unused clocks
Feb 16 12:11:58 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 16 12:11:58 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 16 12:11:58 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 16 12:11:58 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 392K
Feb 16 12:11:58 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 16 12:11:58 localhost kernel: Run /init as init process
Feb 16 12:11:58 localhost kernel:   with arguments:
Feb 16 12:11:58 localhost kernel:     /init
Feb 16 12:11:58 localhost kernel:   with environment:
Feb 16 12:11:58 localhost kernel:     HOME=/
Feb 16 12:11:58 localhost kernel:     TERM=linux
Feb 16 12:11:58 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64
Feb 16 12:11:58 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 12:11:58 localhost systemd[1]: Detected virtualization kvm.
Feb 16 12:11:58 localhost systemd[1]: Detected architecture x86-64.
Feb 16 12:11:58 localhost systemd[1]: Running in initrd.
Feb 16 12:11:58 localhost systemd[1]: No hostname configured, using default hostname.
Feb 16 12:11:58 localhost systemd[1]: Hostname set to <localhost>.
Feb 16 12:11:58 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 16 12:11:58 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 16 12:11:58 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 16 12:11:58 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 16 12:11:58 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 16 12:11:58 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 16 12:11:58 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 16 12:11:58 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 16 12:11:58 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 16 12:11:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 12:11:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 12:11:58 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 16 12:11:58 localhost systemd[1]: Reached target Local File Systems.
Feb 16 12:11:58 localhost systemd[1]: Reached target Path Units.
Feb 16 12:11:58 localhost systemd[1]: Reached target Slice Units.
Feb 16 12:11:58 localhost systemd[1]: Reached target Swaps.
Feb 16 12:11:58 localhost systemd[1]: Reached target Timer Units.
Feb 16 12:11:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 12:11:58 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 16 12:11:58 localhost systemd[1]: Listening on Journal Socket.
Feb 16 12:11:58 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 12:11:58 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 12:11:58 localhost systemd[1]: Reached target Socket Units.
Feb 16 12:11:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 12:11:58 localhost systemd[1]: Starting Journal Service...
Feb 16 12:11:58 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 12:11:58 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:11:58 localhost systemd[1]: Starting Create System Users...
Feb 16 12:11:58 localhost systemd[1]: Starting Setup Virtual Console...
Feb 16 12:11:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 12:11:58 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:11:58 localhost systemd-journald[306]: Journal started
Feb 16 12:11:58 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/880a2e206f1146bbb51cb4136280b28f) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:11:58 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Feb 16 12:11:58 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Feb 16 12:11:58 localhost systemd[1]: Started Journal Service.
Feb 16 12:11:58 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 16 12:11:58 localhost systemd[1]: Finished Create System Users.
Feb 16 12:11:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 12:11:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 12:11:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 12:11:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 12:11:58 localhost systemd[1]: Finished Setup Virtual Console.
Feb 16 12:11:58 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut cmdline hook...
Feb 16 12:11:58 localhost dracut-cmdline[326]: dracut-9 dracut-057-110.git20260130.el9
Feb 16 12:11:58 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:58 localhost systemd[1]: Finished dracut cmdline hook.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 16 12:11:58 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 16 12:11:58 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 16 12:11:58 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 16 12:11:58 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 16 12:11:58 localhost kernel: RPC: Registered udp transport module.
Feb 16 12:11:58 localhost kernel: RPC: Registered tcp transport module.
Feb 16 12:11:58 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 16 12:11:58 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 16 12:11:58 localhost rpc.statd[442]: Version 2.5.4 starting
Feb 16 12:11:58 localhost rpc.statd[442]: Initializing NSM state
Feb 16 12:11:58 localhost rpc.idmapd[447]: Setting log level to 0
Feb 16 12:11:58 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 16 12:11:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 12:11:58 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 12:11:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 16 12:11:58 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 16 12:11:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 12:11:58 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 16 12:11:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:11:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 12:11:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:11:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 12:11:58 localhost systemd[1]: Reached target Network.
Feb 16 12:11:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 12:11:58 localhost systemd[1]: Starting dracut initqueue hook...
Feb 16 12:11:58 localhost kernel: libata version 3.00 loaded.
Feb 16 12:11:58 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 16 12:11:58 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 16 12:11:58 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 16 12:11:58 localhost kernel:  vda: vda1
Feb 16 12:11:58 localhost kernel: scsi host0: ata_piix
Feb 16 12:11:58 localhost kernel: ACPI: bus type drm_connector registered
Feb 16 12:11:58 localhost kernel: scsi host1: ata_piix
Feb 16 12:11:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 16 12:11:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 16 12:11:58 localhost systemd-udevd[481]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:11:58 localhost systemd[1]: Found device /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 12:11:58 localhost systemd[1]: Reached target Initrd Root Device.
Feb 16 12:11:59 localhost kernel: ata1: found unknown device (class 0)
Feb 16 12:11:59 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 16 12:11:59 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 16 12:11:59 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 16 12:11:59 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 16 12:11:59 localhost systemd[1]: Reached target System Initialization.
Feb 16 12:11:59 localhost systemd[1]: Reached target Basic System.
Feb 16 12:11:59 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 16 12:11:59 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 16 12:11:59 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 16 12:11:59 localhost kernel: Console: switching to colour dummy device 80x25
Feb 16 12:11:59 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 16 12:11:59 localhost kernel: [drm] features: -context_init
Feb 16 12:11:59 localhost kernel: [drm] number of scanouts: 1
Feb 16 12:11:59 localhost kernel: [drm] number of cap sets: 0
Feb 16 12:11:59 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 16 12:11:59 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 16 12:11:59 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 16 12:11:59 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 16 12:11:59 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 16 12:11:59 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 16 12:11:59 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 16 12:11:59 localhost systemd[1]: Finished dracut initqueue hook.
Feb 16 12:11:59 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 12:11:59 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 16 12:11:59 localhost systemd[1]: Reached target Remote File Systems.
Feb 16 12:11:59 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 16 12:11:59 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 16 12:11:59 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f...
Feb 16 12:11:59 localhost systemd-fsck[565]: /usr/sbin/fsck.xfs: XFS file system.
Feb 16 12:11:59 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 12:11:59 localhost systemd[1]: Mounting /sysroot...
Feb 16 12:11:59 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 16 12:11:59 localhost kernel: XFS (vda1): Mounting V5 Filesystem 19ee07ed-c14b-4aa3-804d-f2cbdae2694f
Feb 16 12:11:59 localhost kernel: XFS (vda1): Ending clean mount
Feb 16 12:12:00 localhost systemd[1]: Mounted /sysroot.
Feb 16 12:12:00 localhost systemd[1]: Reached target Initrd Root File System.
Feb 16 12:12:00 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 16 12:12:00 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 16 12:12:00 localhost systemd[1]: Reached target Initrd File Systems.
Feb 16 12:12:00 localhost systemd[1]: Reached target Initrd Default Target.
Feb 16 12:12:00 localhost systemd[1]: Starting dracut mount hook...
Feb 16 12:12:00 localhost systemd[1]: Finished dracut mount hook.
Feb 16 12:12:00 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 16 12:12:00 localhost rpc.idmapd[447]: exiting on signal 15
Feb 16 12:12:00 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 16 12:12:00 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 16 12:12:00 localhost systemd[1]: Stopped target Network.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Timer Units.
Feb 16 12:12:00 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 16 12:12:00 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Basic System.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Path Units.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Remote File Systems.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Slice Units.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Socket Units.
Feb 16 12:12:00 localhost systemd[1]: Stopped target System Initialization.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Local File Systems.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Swaps.
Feb 16 12:12:00 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut mount hook.
Feb 16 12:12:00 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 16 12:12:00 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 16 12:12:00 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 16 12:12:00 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 16 12:12:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 16 12:12:00 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 16 12:12:00 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 16 12:12:00 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 16 12:12:00 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 16 12:12:00 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 16 12:12:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 16 12:12:00 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Closed udev Control Socket.
Feb 16 12:12:00 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Closed udev Kernel Socket.
Feb 16 12:12:00 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 16 12:12:00 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 16 12:12:00 localhost systemd[1]: Starting Cleanup udev Database...
Feb 16 12:12:00 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 16 12:12:00 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 16 12:12:00 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Stopped Create System Users.
Feb 16 12:12:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 16 12:12:00 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Cleanup udev Database.
Feb 16 12:12:00 localhost systemd[1]: Reached target Switch Root.
Feb 16 12:12:00 localhost systemd[1]: Starting Switch Root...
Feb 16 12:12:00 localhost systemd[1]: Switching root.
Feb 16 12:12:00 localhost systemd-journald[306]: Journal stopped
Feb 16 12:12:02 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Feb 16 12:12:02 localhost kernel: audit: type=1404 audit(1771243920.920:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability open_perms=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:12:02 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:12:02 localhost kernel: audit: type=1403 audit(1771243921.089:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 16 12:12:02 localhost systemd[1]: Successfully loaded SELinux policy in 193.355ms.
Feb 16 12:12:02 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 70.153ms.
Feb 16 12:12:02 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 12:12:02 localhost systemd[1]: Detected virtualization kvm.
Feb 16 12:12:02 localhost systemd[1]: Detected architecture x86-64.
Feb 16 12:12:02 localhost systemd-rc-local-generator[646]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:12:02 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Stopped Switch Root.
Feb 16 12:12:02 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 16 12:12:02 localhost systemd[1]: Created slice Slice /system/getty.
Feb 16 12:12:02 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 16 12:12:02 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 16 12:12:02 localhost systemd[1]: Created slice User and Session Slice.
Feb 16 12:12:02 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 12:12:02 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 16 12:12:02 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 16 12:12:02 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 12:12:02 localhost systemd[1]: Stopped target Switch Root.
Feb 16 12:12:02 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 16 12:12:02 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 16 12:12:02 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 16 12:12:02 localhost systemd[1]: Reached target Path Units.
Feb 16 12:12:02 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 16 12:12:02 localhost systemd[1]: Reached target Slice Units.
Feb 16 12:12:02 localhost systemd[1]: Reached target Swaps.
Feb 16 12:12:02 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 16 12:12:02 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 16 12:12:02 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 16 12:12:02 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 16 12:12:02 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 16 12:12:02 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 12:12:02 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 12:12:02 localhost systemd[1]: Mounting Huge Pages File System...
Feb 16 12:12:02 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 16 12:12:02 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 16 12:12:02 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 16 12:12:02 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 12:12:02 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 12:12:02 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:12:02 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 16 12:12:02 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 16 12:12:02 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 16 12:12:02 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 16 12:12:02 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 16 12:12:02 localhost systemd[1]: Stopped Journal Service.
Feb 16 12:12:02 localhost systemd[1]: Starting Journal Service...
Feb 16 12:12:02 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 12:12:02 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 16 12:12:02 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:12:02 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 16 12:12:02 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 16 12:12:02 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:12:02 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 12:12:02 localhost kernel: fuse: init (API version 7.37)
Feb 16 12:12:02 localhost systemd[1]: Mounted Huge Pages File System.
Feb 16 12:12:02 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 16 12:12:02 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 16 12:12:02 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 16 12:12:02 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 12:12:02 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:12:02 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 16 12:12:02 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 16 12:12:02 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 16 12:12:02 localhost systemd-journald[695]: Journal started
Feb 16 12:12:02 localhost systemd-journald[695]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:12:02 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 16 12:12:02 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Started Journal Service.
Feb 16 12:12:02 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 16 12:12:02 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 16 12:12:02 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 16 12:12:02 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 16 12:12:02 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 16 12:12:02 localhost systemd[1]: Mounting FUSE Control File System...
Feb 16 12:12:02 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 12:12:02 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 16 12:12:02 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 16 12:12:02 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 16 12:12:02 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 16 12:12:02 localhost systemd[1]: Starting Create System Users...
Feb 16 12:12:02 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:12:02 localhost systemd[1]: Mounted FUSE Control File System.
Feb 16 12:12:02 localhost systemd-journald[695]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:12:02 localhost systemd-journald[695]: Received client request to flush runtime journal.
Feb 16 12:12:02 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 16 12:12:02 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 12:12:03 localhost systemd[1]: Finished Create System Users.
Feb 16 12:12:03 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 16 12:12:03 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 12:12:03 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 12:12:03 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 12:12:03 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 16 12:12:03 localhost systemd[1]: Reached target Local File Systems.
Feb 16 12:12:03 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 16 12:12:03 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 16 12:12:03 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 16 12:12:03 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 16 12:12:03 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 16 12:12:03 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 16 12:12:03 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 12:12:03 localhost bootctl[712]: Couldn't find EFI system partition, skipping.
Feb 16 12:12:03 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 16 12:12:03 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 12:12:03 localhost systemd[1]: Starting Security Auditing Service...
Feb 16 12:12:03 localhost systemd[1]: Starting RPC Bind...
Feb 16 12:12:03 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 16 12:12:03 localhost auditd[719]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 16 12:12:03 localhost auditd[719]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 16 12:12:03 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 16 12:12:03 localhost systemd[1]: Started RPC Bind.
Feb 16 12:12:03 localhost augenrules[724]: /sbin/augenrules: No change
Feb 16 12:12:03 localhost augenrules[739]: No rules
Feb 16 12:12:03 localhost augenrules[739]: enabled 1
Feb 16 12:12:03 localhost augenrules[739]: failure 1
Feb 16 12:12:03 localhost augenrules[739]: pid 719
Feb 16 12:12:03 localhost augenrules[739]: rate_limit 0
Feb 16 12:12:03 localhost augenrules[739]: backlog_limit 8192
Feb 16 12:12:03 localhost augenrules[739]: lost 0
Feb 16 12:12:03 localhost augenrules[739]: backlog 0
Feb 16 12:12:03 localhost augenrules[739]: backlog_wait_time 60000
Feb 16 12:12:03 localhost augenrules[739]: backlog_wait_time_actual 0
Feb 16 12:12:03 localhost systemd[1]: Started Security Auditing Service.
Feb 16 12:12:03 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 16 12:12:03 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 16 12:12:03 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 16 12:12:03 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 12:12:03 localhost systemd-udevd[747]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 12:12:03 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 12:12:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:12:03 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 16 12:12:03 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:12:03 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:12:03 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 16 12:12:03 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 16 12:12:03 localhost systemd-udevd[781]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:12:03 localhost systemd[1]: Starting Update is Completed...
Feb 16 12:12:03 localhost systemd[1]: Finished Update is Completed.
Feb 16 12:12:03 localhost systemd[1]: Reached target System Initialization.
Feb 16 12:12:03 localhost systemd[1]: Started dnf makecache --timer.
Feb 16 12:12:03 localhost systemd[1]: Started Daily rotation of log files.
Feb 16 12:12:03 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 16 12:12:03 localhost systemd[1]: Reached target Timer Units.
Feb 16 12:12:03 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 12:12:03 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 16 12:12:03 localhost systemd[1]: Reached target Socket Units.
Feb 16 12:12:04 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 16 12:12:04 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 16 12:12:04 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 16 12:12:04 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 16 12:12:04 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:12:04 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 16 12:12:04 localhost systemd[1]: Reached target Basic System.
Feb 16 12:12:04 localhost dbus-broker-lau[793]: Ready
Feb 16 12:12:04 localhost kernel: kvm_amd: TSC scaling supported
Feb 16 12:12:04 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 16 12:12:04 localhost kernel: kvm_amd: Nested Paging enabled
Feb 16 12:12:04 localhost kernel: kvm_amd: LBR virtualization supported
Feb 16 12:12:04 localhost systemd[1]: Starting NTP client/server...
Feb 16 12:12:04 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 16 12:12:04 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 16 12:12:04 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 16 12:12:04 localhost systemd[1]: Started irqbalance daemon.
Feb 16 12:12:04 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 16 12:12:04 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:04 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:04 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:04 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 16 12:12:04 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 16 12:12:04 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 16 12:12:04 localhost systemd[1]: Starting User Login Management...
Feb 16 12:12:04 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 16 12:12:04 localhost chronyd[842]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 12:12:04 localhost chronyd[842]: Loaded 0 symmetric keys
Feb 16 12:12:04 localhost chronyd[842]: Using right/UTC timezone to obtain leap second data
Feb 16 12:12:04 localhost chronyd[842]: Loaded seccomp filter (level 2)
Feb 16 12:12:04 localhost systemd[1]: Started NTP client/server.
Feb 16 12:12:04 localhost systemd-logind[818]: New seat seat0.
Feb 16 12:12:04 localhost systemd-logind[818]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 12:12:04 localhost systemd-logind[818]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 12:12:04 localhost systemd[1]: Started User Login Management.
Feb 16 12:12:04 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 16 12:12:04 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 16 12:12:04 localhost iptables.init[812]: iptables: Applying firewall rules: [  OK  ]
Feb 16 12:12:04 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 16 12:12:04 localhost cloud-init[851]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 16 Feb 2026 12:12:04 +0000. Up 8.23 seconds.
Feb 16 12:12:04 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 16 12:12:04 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 16 12:12:04 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpgp910oys.mount: Deactivated successfully.
Feb 16 12:12:05 localhost systemd[1]: Starting Hostname Service...
Feb 16 12:12:05 localhost systemd[1]: Started Hostname Service.
Feb 16 12:12:05 np0005620856.novalocal systemd-hostnamed[865]: Hostname set to <np0005620856.novalocal> (static)
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Reached target Preparation for Network.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Starting Network Manager...
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3133] NetworkManager (version 1.54.3-2.el9) is starting... (boot:8a9ec7d7-dead-4443-8176-f7a3c4743d84)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3139] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3273] manager[0x556be6672000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3329] hostname: hostname: using hostnamed
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3329] hostname: static hostname changed from (none) to "np0005620856.novalocal"
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3333] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3424] manager[0x556be6672000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3425] manager[0x556be6672000]: rfkill: WWAN hardware radio set enabled
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3537] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3538] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3539] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3540] manager: Networking is enabled by state file
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3543] settings: Loaded settings plugin: keyfile (internal)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3605] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3630] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3646] dhcp: init: Using DHCP client 'internal'
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3651] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3667] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3681] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3692] device (lo): Activation: starting connection 'lo' (e66ad5fa-4651-424c-a7b4-2a119df1e243)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3698] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3700] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3723] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3726] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3728] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3730] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3731] device (eth0): carrier: link connected
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3732] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3737] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3741] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3745] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3745] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3747] manager: NetworkManager state is now CONNECTING
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3748] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3753] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3755] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Started Network Manager.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3785] dhcp4 (eth0): state changed new lease, address=38.102.83.210
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3791] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3804] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Reached target Network.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3986] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3989] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3990] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.3994] device (lo): Activation: successful, device activated.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.4000] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.4004] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.4006] device (eth0): Activation: successful, device activated.
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.4012] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 12:12:05 np0005620856.novalocal NetworkManager[869]: <info>  [1771243925.4015] manager: startup complete
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Reached target NFS client services.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Reached target Remote File Systems.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 12:12:05 np0005620856.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 16 Feb 2026 12:12:05 +0000. Up 9.12 seconds.
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |  eth0  | True |        38.102.83.210         | 255.255.255.0 | global | fa:16:3e:c1:73:16 |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |  eth0  | True | fe80::f816:3eff:fec1:7316/64 |       .       |  link  | fa:16:3e:c1:73:16 |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 16 12:12:05 np0005620856.novalocal cloud-init[934]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: new group: name=cloud-user, GID=1001
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: add 'cloud-user' to group 'adm'
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: add 'cloud-user' to group 'systemd-journal'
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: add 'cloud-user' to shadow group 'adm'
Feb 16 12:12:06 np0005620856.novalocal useradd[1000]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Generating public/private rsa key pair.
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key fingerprint is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: SHA256:NnKjmSGw86q82UurWgmJGCfkxo472TWXFH9PF32RSNE root@np0005620856.novalocal
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key's randomart image is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +---[RSA 3072]----+
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: | .    .     .o=oo|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |+      o     . Eo|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |o+o   . . . . . .|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |*= o . . . o .   |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |*.o + = S   .    |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: | = = + O o       |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |+ +.. +          |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |.o+..            |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |+*++.            |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +----[SHA256]-----+
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Generating public/private ecdsa key pair.
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key fingerprint is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: SHA256:4x6VMRMQtdnYkd1pFU13jLK6owUMBqdrasWbzzLrvGM root@np0005620856.novalocal
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key's randomart image is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +---[ECDSA 256]---+
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |    . . o+o .o *X|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |     +     Bo.oo*|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |    . o   B oo.  |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |   . o o   =.    |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |    =   S o.     |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |   + o . +.      |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |  o o   o ..     |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: | . .Eo . oo      |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |   o=*o o. .     |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +----[SHA256]-----+
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Generating public/private ed25519 key pair.
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key fingerprint is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: SHA256:+LCKMZYn+cn/h0o3Nhv2hGJ2Aldw32s4C/zNibatz9c root@np0005620856.novalocal
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: The key's randomart image is:
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +--[ED25519 256]--+
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |      . .        |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |       o . .     |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |        . . .    |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |       +   . .   |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |    . + S o o    |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |   o o + + B .   |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |  B . B @.* +  . |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: | . O * B.O.+  . E|
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: |  . =.ooo.+o+.   |
Feb 16 12:12:06 np0005620856.novalocal cloud-init[934]: +----[SHA256]-----+
Feb 16 12:12:06 np0005620856.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 16 12:12:06 np0005620856.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 16 12:12:06 np0005620856.novalocal systemd[1]: Reached target Network is Online.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting System Logging Service...
Feb 16 12:12:07 np0005620856.novalocal sm-notify[1016]: Version 2.5.4 starting
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Permit User Sessions...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 16 12:12:07 np0005620856.novalocal sshd[1018]: Server listening on 0.0.0.0 port 22.
Feb 16 12:12:07 np0005620856.novalocal sshd[1018]: Server listening on :: port 22.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Finished Permit User Sessions.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started Command Scheduler.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started Getty on tty1.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Reached target Login Prompts.
Feb 16 12:12:07 np0005620856.novalocal crond[1021]: (CRON) STARTUP (1.5.7)
Feb 16 12:12:07 np0005620856.novalocal crond[1021]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 16 12:12:07 np0005620856.novalocal crond[1021]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 59% if used.)
Feb 16 12:12:07 np0005620856.novalocal crond[1021]: (CRON) INFO (running with inotify support)
Feb 16 12:12:07 np0005620856.novalocal rsyslogd[1017]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1017" x-info="https://www.rsyslog.com"] start
Feb 16 12:12:07 np0005620856.novalocal rsyslogd[1017]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Started System Logging Service.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Reached target Multi-User System.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 16 12:12:07 np0005620856.novalocal rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1159]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 16 Feb 2026 12:12:07 +0000. Up 10.74 seconds.
Feb 16 12:12:07 np0005620856.novalocal kdumpctl[1030]: kdump: No kdump initial ramdisk found.
Feb 16 12:12:07 np0005620856.novalocal kdumpctl[1030]: kdump: Rebuilding /boot/initramfs-5.14.0-677.el9.x86_64kdump.img
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1486]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 16 Feb 2026 12:12:07 +0000. Up 11.13 seconds.
Feb 16 12:12:07 np0005620856.novalocal dracut[1504]: dracut-057-110.git20260130.el9
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1509]: #############################################################
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1514]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1522]: 256 SHA256:4x6VMRMQtdnYkd1pFU13jLK6owUMBqdrasWbzzLrvGM root@np0005620856.novalocal (ECDSA)
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1526]: 256 SHA256:+LCKMZYn+cn/h0o3Nhv2hGJ2Aldw32s4C/zNibatz9c root@np0005620856.novalocal (ED25519)
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1528]: 3072 SHA256:NnKjmSGw86q82UurWgmJGCfkxo472TWXFH9PF32RSNE root@np0005620856.novalocal (RSA)
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1529]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1530]: #############################################################
Feb 16 12:12:07 np0005620856.novalocal cloud-init[1486]: Cloud-init v. 24.4-8.el9 finished at Mon, 16 Feb 2026 12:12:07 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.28 seconds
Feb 16 12:12:07 np0005620856.novalocal dracut[1506]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-677.el9.x86_64kdump.img 5.14.0-677.el9.x86_64
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 16 12:12:07 np0005620856.novalocal systemd[1]: Reached target Cloud-init target.
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1809]: Connection closed by 38.102.83.114 port 38062 [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1829]: Unable to negotiate with 38.102.83.114 port 35200: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1854]: Unable to negotiate with 38.102.83.114 port 35224: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1865]: Unable to negotiate with 38.102.83.114 port 35236: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1903]: Unable to negotiate with 38.102.83.114 port 35262: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1839]: Connection closed by 38.102.83.114 port 35216 [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1909]: Unable to negotiate with 38.102.83.114 port 35276: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1876]: Connection closed by 38.102.83.114 port 35240 [preauth]
Feb 16 12:12:08 np0005620856.novalocal sshd-session[1890]: Connection closed by 38.102.83.114 port 35254 [preauth]
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: memstrack is not available
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 12:12:08 np0005620856.novalocal dracut[1506]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: memstrack is not available
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: *** Including module: systemd ***
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: *** Including module: fips ***
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: *** Including module: systemd-initrd ***
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: *** Including module: i18n ***
Feb 16 12:12:09 np0005620856.novalocal dracut[1506]: *** Including module: drm ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: prefixdevname ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: kernel-modules ***
Feb 16 12:12:10 np0005620856.novalocal chronyd[842]: Selected source 209.227.173.244 (2.centos.pool.ntp.org)
Feb 16 12:12:10 np0005620856.novalocal chronyd[842]: System clock TAI offset set to 37 seconds
Feb 16 12:12:10 np0005620856.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: kernel-modules-extra ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: qemu ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: fstab-sys ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: rootfs-block ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: terminfo ***
Feb 16 12:12:10 np0005620856.novalocal dracut[1506]: *** Including module: udev-rules ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: Skipping udev rule: 91-permissions.rules
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: virtiofs ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: dracut-systemd ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: usrmount ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: base ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: fs-lib ***
Feb 16 12:12:11 np0005620856.novalocal dracut[1506]: *** Including module: kdumpbase ***
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:   microcode_ctl module: mangling fw_dir
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]: *** Including module: openssl ***
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]: *** Including module: shutdown ***
Feb 16 12:12:12 np0005620856.novalocal dracut[1506]: *** Including module: squash ***
Feb 16 12:12:13 np0005620856.novalocal dracut[1506]: *** Including modules done ***
Feb 16 12:12:13 np0005620856.novalocal dracut[1506]: *** Installing kernel module dependencies ***
Feb 16 12:12:13 np0005620856.novalocal dracut[1506]: *** Installing kernel module dependencies done ***
Feb 16 12:12:13 np0005620856.novalocal dracut[1506]: *** Resolving executable dependencies ***
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 25 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 31 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 28 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 32 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 30 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 16 12:12:14 np0005620856.novalocal irqbalance[816]: IRQ 29 affinity is now unmanaged
Feb 16 12:12:14 np0005620856.novalocal dracut[1506]: *** Resolving executable dependencies done ***
Feb 16 12:12:14 np0005620856.novalocal dracut[1506]: *** Generating early-microcode cpio image ***
Feb 16 12:12:15 np0005620856.novalocal dracut[1506]: *** Store current command line parameters ***
Feb 16 12:12:15 np0005620856.novalocal dracut[1506]: Stored kernel commandline:
Feb 16 12:12:15 np0005620856.novalocal dracut[1506]: No dracut internal kernel commandline stored in the initramfs
Feb 16 12:12:15 np0005620856.novalocal dracut[1506]: *** Install squash loader ***
Feb 16 12:12:15 np0005620856.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:12:15 np0005620856.novalocal dracut[1506]: *** Squashing the files inside the initramfs ***
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: *** Squashing the files inside the initramfs done ***
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: *** Creating image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' ***
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: *** Hardlinking files ***
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Mode:           real
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Files:          50
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Linked:         0 files
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Compared:       0 xattrs
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Compared:       0 files
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Saved:          0 B
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: Duration:       0.000566 seconds
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: *** Hardlinking files done ***
Feb 16 12:12:17 np0005620856.novalocal dracut[1506]: *** Creating initramfs image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' done ***
Feb 16 12:12:18 np0005620856.novalocal kdumpctl[1030]: kdump: kexec: loaded kdump kernel
Feb 16 12:12:18 np0005620856.novalocal kdumpctl[1030]: kdump: Starting kdump: [OK]
Feb 16 12:12:18 np0005620856.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 16 12:12:18 np0005620856.novalocal systemd[1]: Startup finished in 1.277s (kernel) + 3.060s (initrd) + 17.351s (userspace) = 21.689s.
Feb 16 12:12:35 np0005620856.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:13:16 np0005620856.novalocal sshd-session[4796]: Accepted publickey for zuul from 38.102.83.114 port 54980 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 16 12:13:16 np0005620856.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 16 12:13:16 np0005620856.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 16 12:13:16 np0005620856.novalocal systemd-logind[818]: New session 1 of user zuul.
Feb 16 12:13:16 np0005620856.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 16 12:13:16 np0005620856.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Queued start job for default target Main User Target.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Created slice User Application Slice.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Reached target Paths.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Reached target Timers.
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Starting D-Bus User Message Bus Socket...
Feb 16 12:13:16 np0005620856.novalocal systemd[4800]: Starting Create User's Volatile Files and Directories...
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Finished Create User's Volatile Files and Directories.
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Listening on D-Bus User Message Bus Socket.
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Reached target Sockets.
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Reached target Basic System.
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Reached target Main User Target.
Feb 16 12:13:17 np0005620856.novalocal systemd[4800]: Startup finished in 113ms.
Feb 16 12:13:17 np0005620856.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 16 12:13:17 np0005620856.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 16 12:13:17 np0005620856.novalocal sshd-session[4796]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:13:17 np0005620856.novalocal python3[4882]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:23 np0005620856.novalocal python3[4910]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:30 np0005620856.novalocal python3[4968]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:30 np0005620856.novalocal python3[5008]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 16 12:13:33 np0005620856.novalocal python3[5034]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDEJ+pe7LiJg9hVdoAKQ3i1qJF3L7BdinKweX4yR5h1ilsosYkEGsDObr6Mpln9z+TBza2qi/QM4ApOm8G9IXqCNzAFmiKaxc3hoc/d9IsH/s3MeDHtpQAnQlqE3Y5hs6PeHJyA3ivsTX406B10I94prnPs/s5E27DC1AMcOxTN942G4U4Bcrzq5z7IODNs1GO0RoIhIf5ineLLFextnNL5bj71Qr4cTvgeBJRky09Csj3rsTUdu9QQ25yMqYYyv+BgrbPJU3OeGEJIpbnVuBGqfnF/+wei2jqiSFyzTLwwz0pLrLOH1/y+JWNWgsbYM95pE8noPRZQBddrESG9zAXy/IhEiYE6GfbucJDR/S05liuwLOtHmw29SVPzkQxae1eI3xZVS2rlxNQ5W09IDsZs9ZSch4KeHJ9YYRBZpdV1tyjMZ6KZrp+YvMaTfUwmPxbZzVfubTQRxAXPWqWxk8yJdz9eys5mQIcsSECfqnPPAC5gYiXrQErNf7T1DXAX4Dc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:33 np0005620856.novalocal python3[5058]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:34 np0005620856.novalocal python3[5157]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:34 np0005620856.novalocal python3[5228]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244013.9229226-229-205603632170074/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7ba9b203e349477e959408b698d6a67f_id_rsa follow=False checksum=8844458f19d0dfd440f6deb91fc2a8ce557db95a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:35 np0005620856.novalocal python3[5351]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:35 np0005620856.novalocal python3[5422]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244014.861824-273-157724906750509/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7ba9b203e349477e959408b698d6a67f_id_rsa.pub follow=False checksum=9d5f0045bc91051867707ffccb8d9c1ad0378fd4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:37 np0005620856.novalocal python3[5470]: ansible-ping Invoked with data=pong
Feb 16 12:13:38 np0005620856.novalocal python3[5494]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:41 np0005620856.novalocal python3[5552]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 16 12:13:42 np0005620856.novalocal python3[5584]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:42 np0005620856.novalocal python3[5608]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:42 np0005620856.novalocal python3[5632]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620856.novalocal python3[5656]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620856.novalocal python3[5680]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620856.novalocal python3[5704]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:45 np0005620856.novalocal sudo[5728]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tctklyafbckmbgomfwymbrdjbfnplrht ; /usr/bin/python3'
Feb 16 12:13:45 np0005620856.novalocal sudo[5728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:45 np0005620856.novalocal python3[5730]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:45 np0005620856.novalocal sudo[5728]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:45 np0005620856.novalocal sudo[5806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgzefalcnqdbuielnfktsckzqcsrjsn ; /usr/bin/python3'
Feb 16 12:13:45 np0005620856.novalocal sudo[5806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:46 np0005620856.novalocal python3[5808]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:46 np0005620856.novalocal sudo[5806]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:46 np0005620856.novalocal sudo[5879]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcekkkskygunrnzjvyusdwwdvtdscnub ; /usr/bin/python3'
Feb 16 12:13:46 np0005620856.novalocal sudo[5879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:46 np0005620856.novalocal python3[5881]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244025.5206203-26-63378575643307/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:46 np0005620856.novalocal sudo[5879]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:47 np0005620856.novalocal python3[5929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620856.novalocal python3[5953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620856.novalocal python3[5977]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620856.novalocal python3[6001]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620856.novalocal python3[6025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620856.novalocal python3[6049]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620856.novalocal python3[6073]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620856.novalocal python3[6097]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620856.novalocal python3[6121]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620856.novalocal python3[6145]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620856.novalocal python3[6169]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620856.novalocal python3[6193]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620856.novalocal python3[6217]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620856.novalocal python3[6241]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620856.novalocal python3[6265]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620856.novalocal python3[6289]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620856.novalocal python3[6313]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620856.novalocal python3[6337]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620856.novalocal python3[6361]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620856.novalocal python3[6385]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620856.novalocal python3[6409]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620856.novalocal python3[6433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620856.novalocal python3[6457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620856.novalocal python3[6481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620856.novalocal python3[6505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620856.novalocal python3[6529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:56 np0005620856.novalocal sudo[6553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejryyhjkogfcapgreyhovdagzstkqqzu ; /usr/bin/python3'
Feb 16 12:13:57 np0005620856.novalocal sudo[6553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:57 np0005620856.novalocal python3[6555]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 12:13:57 np0005620856.novalocal systemd[1]: Starting Time & Date Service...
Feb 16 12:13:57 np0005620856.novalocal systemd[1]: Started Time & Date Service.
Feb 16 12:13:57 np0005620856.novalocal systemd-timedated[6557]: Changed time zone to 'UTC' (UTC).
Feb 16 12:13:57 np0005620856.novalocal sudo[6553]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:57 np0005620856.novalocal sudo[6584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwdfujjbwtsxkigbzptnqhtxjnoqvbhw ; /usr/bin/python3'
Feb 16 12:13:57 np0005620856.novalocal sudo[6584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:57 np0005620856.novalocal python3[6586]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:57 np0005620856.novalocal sudo[6584]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:58 np0005620856.novalocal python3[6662]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:58 np0005620856.novalocal python3[6733]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771244037.9967406-202-117610087855263/source _original_basename=tmpxzhpw42a follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:59 np0005620856.novalocal python3[6833]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:59 np0005620856.novalocal python3[6904]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771244038.8153443-242-264480814498885/source _original_basename=tmp5zl4usx0 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:00 np0005620856.novalocal sudo[7004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzyiolksseombeizpztsyejwpzmgnklu ; /usr/bin/python3'
Feb 16 12:14:00 np0005620856.novalocal sudo[7004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:00 np0005620856.novalocal python3[7006]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:14:00 np0005620856.novalocal sudo[7004]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:00 np0005620856.novalocal sudo[7077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvehuedmehgvcmtnpdcimttwiqydlapo ; /usr/bin/python3'
Feb 16 12:14:00 np0005620856.novalocal sudo[7077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:00 np0005620856.novalocal python3[7079]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771244040.0357153-306-66559042609084/source _original_basename=tmpe7dx2db3 follow=False checksum=675da38221554070fad736c9d717667e6ac7d120 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:00 np0005620856.novalocal sudo[7077]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:01 np0005620856.novalocal python3[7127]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:01 np0005620856.novalocal python3[7153]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:01 np0005620856.novalocal sudo[7231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lokdwfpgpoldnqslgqwcqqgyvidpkvvv ; /usr/bin/python3'
Feb 16 12:14:01 np0005620856.novalocal sudo[7231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:01 np0005620856.novalocal python3[7233]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:14:01 np0005620856.novalocal sudo[7231]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:02 np0005620856.novalocal sudo[7304]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgjsoimtveozbrkmfnwwiqonpleloby ; /usr/bin/python3'
Feb 16 12:14:02 np0005620856.novalocal sudo[7304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:02 np0005620856.novalocal python3[7306]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244041.6866364-362-263040601958145/source _original_basename=tmp1cmn_6yb follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:02 np0005620856.novalocal sudo[7304]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:02 np0005620856.novalocal sudo[7355]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooyshgocmxibnzrhhspqcibknofjzjnx ; /usr/bin/python3'
Feb 16 12:14:02 np0005620856.novalocal sudo[7355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:02 np0005620856.novalocal python3[7357]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-8596-46eb-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:02 np0005620856.novalocal sudo[7355]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:03 np0005620856.novalocal python3[7385]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-8596-46eb-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 16 12:14:04 np0005620856.novalocal python3[7413]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:22 np0005620856.novalocal sudo[7437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkemmevnstlxnrrcbvizxjofumtloon ; /usr/bin/python3'
Feb 16 12:14:22 np0005620856.novalocal sudo[7437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:22 np0005620856.novalocal python3[7439]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:22 np0005620856.novalocal sudo[7437]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:27 np0005620856.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 16 12:14:58 np0005620856.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 16 12:14:58 np0005620856.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8124] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 12:14:58 np0005620856.novalocal systemd-udevd[7442]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8359] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8386] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8390] device (eth1): carrier: link connected
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8393] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8399] policy: auto-activating connection 'Wired connection 1' (1c93776b-06f5-3002-90c0-48f1720b7a9b)
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8404] device (eth1): Activation: starting connection 'Wired connection 1' (1c93776b-06f5-3002-90c0-48f1720b7a9b)
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8405] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8409] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8414] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:14:58 np0005620856.novalocal NetworkManager[869]: <info>  [1771244098.8419] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:14:59 np0005620856.novalocal python3[7469]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-886a-88de-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:15:06 np0005620856.novalocal sudo[7547]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdbcktkpodhygxcgeolzrdzbslnajpq ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:06 np0005620856.novalocal sudo[7547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:06 np0005620856.novalocal python3[7549]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:15:06 np0005620856.novalocal sudo[7547]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:07 np0005620856.novalocal sudo[7620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izxnawwoowffmkpyxuihcgdakhdayxhq ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:07 np0005620856.novalocal sudo[7620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:07 np0005620856.novalocal python3[7622]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244106.5938513-103-101905152472686/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=eee69192fc62af7024a6c4fe48a21ead5fb78e6a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:15:07 np0005620856.novalocal sudo[7620]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:07 np0005620856.novalocal sudo[7670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syoybwybwsyaqnizshtcanesgbifpljf ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:07 np0005620856.novalocal sudo[7670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:08 np0005620856.novalocal python3[7672]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Stopping Network Manager...
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0555] caught SIGTERM, shutting down normally.
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0562] dhcp4 (eth0): canceled DHCP transaction
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0563] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0563] dhcp4 (eth0): state changed no lease
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0565] manager: NetworkManager state is now CONNECTING
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0713] dhcp4 (eth1): canceled DHCP transaction
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0713] dhcp4 (eth1): state changed no lease
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[869]: <info>  [1771244108.0756] exiting (success)
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Stopped Network Manager.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: NetworkManager.service: Consumed 1.461s CPU time, 10.0M memory peak.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Starting Network Manager...
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.1329] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:8a9ec7d7-dead-4443-8176-f7a3c4743d84)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.1333] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.1397] manager[0x561fc4258000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Starting Hostname Service...
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Started Hostname Service.
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2082] hostname: hostname: using hostnamed
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2082] hostname: static hostname changed from (none) to "np0005620856.novalocal"
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2088] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2093] manager[0x561fc4258000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2093] manager[0x561fc4258000]: rfkill: WWAN hardware radio set enabled
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2124] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2124] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2125] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2126] manager: Networking is enabled by state file
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2128] settings: Loaded settings plugin: keyfile (internal)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2132] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2161] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2173] dhcp: init: Using DHCP client 'internal'
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2176] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2182] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2190] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2197] device (lo): Activation: starting connection 'lo' (e66ad5fa-4651-424c-a7b4-2a119df1e243)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2204] device (eth0): carrier: link connected
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2206] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2210] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2211] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2218] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2224] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2230] device (eth1): carrier: link connected
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2233] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2239] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (1c93776b-06f5-3002-90c0-48f1720b7a9b) (indicated)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2239] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2243] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2249] device (eth1): Activation: starting connection 'Wired connection 1' (1c93776b-06f5-3002-90c0-48f1720b7a9b)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2255] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Started Network Manager.
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2260] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2262] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2265] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2266] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2271] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2274] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2278] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2283] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2290] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2294] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2302] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2304] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2322] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2323] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 12:15:08 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244108.2327] device (lo): Activation: successful, device activated.
Feb 16 12:15:08 np0005620856.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 12:15:08 np0005620856.novalocal sudo[7670]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:08 np0005620856.novalocal python3[7737]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-886a-88de-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4524] dhcp4 (eth0): state changed new lease, address=38.102.83.210
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4539] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4627] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4662] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4664] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4667] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4669] device (eth0): Activation: successful, device activated.
Feb 16 12:15:10 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244110.4674] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 12:15:20 np0005620856.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:15:27 np0005620856.novalocal systemd[4800]: Starting Mark boot as successful...
Feb 16 12:15:27 np0005620856.novalocal systemd[4800]: Finished Mark boot as successful.
Feb 16 12:15:38 np0005620856.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5494] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 12:15:53 np0005620856.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:15:53 np0005620856.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5777] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5781] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5790] device (eth1): Activation: successful, device activated.
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5803] manager: startup complete
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5806] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <warn>  [1771244153.5814] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5828] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5984] dhcp4 (eth1): canceled DHCP transaction
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5985] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.5985] dhcp4 (eth1): state changed no lease
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6009] policy: auto-activating connection 'ci-private-network' (5342df4c-d20b-514f-b7a0-cf6ea02e3054)
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6016] device (eth1): Activation: starting connection 'ci-private-network' (5342df4c-d20b-514f-b7a0-cf6ea02e3054)
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6018] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6022] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6031] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6044] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6102] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6106] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 12:15:53 np0005620856.novalocal NetworkManager[7684]: <info>  [1771244153.6118] device (eth1): Activation: successful, device activated.
Feb 16 12:16:03 np0005620856.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:16:08 np0005620856.novalocal sshd-session[4809]: Received disconnect from 38.102.83.114 port 54980:11: disconnected by user
Feb 16 12:16:08 np0005620856.novalocal sshd-session[4809]: Disconnected from user zuul 38.102.83.114 port 54980
Feb 16 12:16:08 np0005620856.novalocal sshd-session[4796]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:16:08 np0005620856.novalocal systemd-logind[818]: Session 1 logged out. Waiting for processes to exit.
Feb 16 12:16:31 np0005620856.novalocal sshd-session[7785]: Accepted publickey for zuul from 38.102.83.114 port 39242 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:16:32 np0005620856.novalocal systemd-logind[818]: New session 3 of user zuul.
Feb 16 12:16:32 np0005620856.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 16 12:16:32 np0005620856.novalocal sshd-session[7785]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:16:32 np0005620856.novalocal sudo[7864]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaaqsgpiwodstekgvuyjeigtlhwfdgyn ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:16:32 np0005620856.novalocal sudo[7864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:16:32 np0005620856.novalocal python3[7866]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:16:32 np0005620856.novalocal sudo[7864]: pam_unix(sudo:session): session closed for user root
Feb 16 12:16:32 np0005620856.novalocal sudo[7937]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmelsnigeyzcicdystypuhayevwzchr ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:16:32 np0005620856.novalocal sudo[7937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:16:32 np0005620856.novalocal python3[7939]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244192.1213892-312-177003983910949/source _original_basename=tmp1uuv6gy0 follow=False checksum=0ef1aafc1c85a846b4f84c91412dbcb842d2a2f1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:16:32 np0005620856.novalocal sudo[7937]: pam_unix(sudo:session): session closed for user root
Feb 16 12:16:35 np0005620856.novalocal sshd-session[7788]: Connection closed by 38.102.83.114 port 39242
Feb 16 12:16:35 np0005620856.novalocal sshd-session[7785]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:16:35 np0005620856.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 16 12:16:35 np0005620856.novalocal systemd-logind[818]: Session 3 logged out. Waiting for processes to exit.
Feb 16 12:16:35 np0005620856.novalocal systemd-logind[818]: Removed session 3.
Feb 16 12:18:27 np0005620856.novalocal systemd[4800]: Created slice User Background Tasks Slice.
Feb 16 12:18:27 np0005620856.novalocal systemd[4800]: Starting Cleanup of User's Temporary Files and Directories...
Feb 16 12:18:27 np0005620856.novalocal systemd[4800]: Finished Cleanup of User's Temporary Files and Directories.
Feb 16 12:27:27 np0005620856.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 16 12:27:27 np0005620856.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 16 12:27:27 np0005620856.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 16 12:27:27 np0005620856.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 16 12:28:10 np0005620856.novalocal sshd-session[7976]: Accepted publickey for zuul from 38.102.83.114 port 41010 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:28:10 np0005620856.novalocal systemd-logind[818]: New session 4 of user zuul.
Feb 16 12:28:10 np0005620856.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 16 12:28:10 np0005620856.novalocal sshd-session[7976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:28:10 np0005620856.novalocal sudo[8003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txbifcybluvrjyvghorbilkgluouqtnp ; /usr/bin/python3'
Feb 16 12:28:10 np0005620856.novalocal sudo[8003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:10 np0005620856.novalocal python3[8005]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-94f8-a218-000000000cd9-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:10 np0005620856.novalocal sudo[8003]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620856.novalocal sudo[8032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqcxrkbkjhgkngqsdzxhgsnlnyahrsu ; /usr/bin/python3'
Feb 16 12:28:11 np0005620856.novalocal sudo[8032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620856.novalocal python3[8034]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620856.novalocal sudo[8032]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620856.novalocal sudo[8058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfcuottlfyfbekqwkdccizlmicfrrdlu ; /usr/bin/python3'
Feb 16 12:28:11 np0005620856.novalocal sudo[8058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620856.novalocal python3[8060]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620856.novalocal sudo[8058]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620856.novalocal sudo[8084]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxzdkrzxhmhyhkcvxrzidyhbyspywfy ; /usr/bin/python3'
Feb 16 12:28:11 np0005620856.novalocal sudo[8084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620856.novalocal python3[8086]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620856.novalocal sudo[8084]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:12 np0005620856.novalocal sudo[8110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zltvrvnzbvyqqwdbfbfoobusartytttf ; /usr/bin/python3'
Feb 16 12:28:12 np0005620856.novalocal sudo[8110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:12 np0005620856.novalocal python3[8112]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:12 np0005620856.novalocal sudo[8110]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620856.novalocal sudo[8136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgzrloqkflylqxmvrcfsvzikfuabhrqg ; /usr/bin/python3'
Feb 16 12:28:13 np0005620856.novalocal sudo[8136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:13 np0005620856.novalocal python3[8138]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:13 np0005620856.novalocal sudo[8136]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620856.novalocal sudo[8214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upxcwzbcpsinuwtoawdchdqqppvsbxwe ; /usr/bin/python3'
Feb 16 12:28:13 np0005620856.novalocal sudo[8214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:13 np0005620856.novalocal python3[8216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:28:13 np0005620856.novalocal sudo[8214]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620856.novalocal sudo[8287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkeheqwtncaflpldveylheerdwpzkkqc ; /usr/bin/python3'
Feb 16 12:28:13 np0005620856.novalocal sudo[8287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:14 np0005620856.novalocal python3[8289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244893.489874-377-280842303534187/source _original_basename=tmpjujawern follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:14 np0005620856.novalocal sudo[8287]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:14 np0005620856.novalocal irqbalance[816]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 16 12:28:14 np0005620856.novalocal irqbalance[816]: IRQ 26 affinity is now unmanaged
Feb 16 12:28:15 np0005620856.novalocal sudo[8337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmyhpbybztrfypjriajdjvfqgiojfibq ; /usr/bin/python3'
Feb 16 12:28:15 np0005620856.novalocal sudo[8337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:15 np0005620856.novalocal python3[8339]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 12:28:15 np0005620856.novalocal systemd[1]: Reloading.
Feb 16 12:28:15 np0005620856.novalocal systemd-rc-local-generator[8357]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:28:15 np0005620856.novalocal sudo[8337]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620856.novalocal sudo[8400]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mogllqomvcwgznyhuyomfgegceppjhlf ; /usr/bin/python3'
Feb 16 12:28:17 np0005620856.novalocal sudo[8400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620856.novalocal python3[8402]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 16 12:28:17 np0005620856.novalocal sudo[8400]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620856.novalocal sudo[8426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbtclutbebjjhlbsvwtqtgedamicasq ; /usr/bin/python3'
Feb 16 12:28:17 np0005620856.novalocal sudo[8426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620856.novalocal python3[8428]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:17 np0005620856.novalocal sudo[8426]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620856.novalocal sudo[8454]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmcfzgewddldqoebxrysyeujrdqrujsu ; /usr/bin/python3'
Feb 16 12:28:17 np0005620856.novalocal sudo[8454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620856.novalocal python3[8456]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:17 np0005620856.novalocal sudo[8454]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620856.novalocal sudo[8482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnjwhnmofplniydsaveoqunztihrqpaf ; /usr/bin/python3'
Feb 16 12:28:17 np0005620856.novalocal sudo[8482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:18 np0005620856.novalocal python3[8484]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:18 np0005620856.novalocal sudo[8482]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:18 np0005620856.novalocal sudo[8510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkdytzzzyymhxskwfhcvmxcbpuyizacv ; /usr/bin/python3'
Feb 16 12:28:18 np0005620856.novalocal sudo[8510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:18 np0005620856.novalocal python3[8512]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:18 np0005620856.novalocal sudo[8510]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:18 np0005620856.novalocal python3[8539]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-94f8-a218-000000000ce0-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
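The four `ansible.legacy.command` tasks above each write the same limit line into a slice's cgroup-v2 `io.max` file, then the last task reads all four back. A minimal shell sketch of that loop, assuming device `252:0` (the node's virtio disk) and the limit values taken from the log; writing for real requires root and the `io` controller enabled, so this sketch only prints the intended action instead of touching `/sys/fs/cgroup`:

```shell
# Limit line as applied in the log: IOPS and byte/s caps for device 252:0.
LIMITS='252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000'

# Same four cgroup-v2 slices the playbook targets.
for slice in init.scope machine.slice system.slice user.slice; do
  target="/sys/fs/cgroup/${slice}/io.max"
  # Real run (as root) would be:  echo "$LIMITS" > "$target"
  echo "would write: ${LIMITS} -> ${target}"
done
```

Reading `io.max` back (as the final task does with `cat`) is how cgroup v2 confirms the limits took effect; an empty file means no limit is set for that slice.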
Feb 16 12:28:19 np0005620856.novalocal python3[8569]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 16 12:28:22 np0005620856.novalocal sshd-session[7979]: Connection closed by 38.102.83.114 port 41010
Feb 16 12:28:22 np0005620856.novalocal sshd-session[7976]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:28:22 np0005620856.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 16 12:28:22 np0005620856.novalocal systemd[1]: session-4.scope: Consumed 3.658s CPU time.
Feb 16 12:28:22 np0005620856.novalocal systemd-logind[818]: Session 4 logged out. Waiting for processes to exit.
Feb 16 12:28:22 np0005620856.novalocal systemd-logind[818]: Removed session 4.
Feb 16 12:28:23 np0005620856.novalocal sshd-session[8573]: Accepted publickey for zuul from 38.102.83.114 port 37690 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:28:23 np0005620856.novalocal systemd-logind[818]: New session 5 of user zuul.
Feb 16 12:28:23 np0005620856.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 16 12:28:23 np0005620856.novalocal sshd-session[8573]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:28:23 np0005620856.novalocal sudo[8600]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjmxqajhkfxeanybszqoishtyofjhdn ; /usr/bin/python3'
Feb 16 12:28:23 np0005620856.novalocal sudo[8600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:24 np0005620856.novalocal python3[8602]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 16 12:28:31 np0005620856.novalocal setsebool[8643]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 16 12:28:31 np0005620856.novalocal setsebool[8643]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:28:42 np0005620856.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:28:51 np0005620856.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:29:08 np0005620856.novalocal dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 12:29:08 np0005620856.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:29:08 np0005620856.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:29:08 np0005620856.novalocal systemd[1]: Reloading.
Feb 16 12:29:08 np0005620856.novalocal systemd-rc-local-generator[9427]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:29:08 np0005620856.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:29:09 np0005620856.novalocal sudo[8600]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:17 np0005620856.novalocal python3[16244]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-c8c4-2e34-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:29:18 np0005620856.novalocal kernel: evm: overlay not supported
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: Starting D-Bus User Message Bus...
Feb 16 12:29:18 np0005620856.novalocal dbus-broker-launch[16990]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 16 12:29:18 np0005620856.novalocal dbus-broker-launch[16990]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: Started D-Bus User Message Bus.
Feb 16 12:29:18 np0005620856.novalocal dbus-broker-lau[16990]: Ready
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: Created slice Slice /user.
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: podman-16916.scope: unit configures an IP firewall, but not running as root.
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: (This warning is only shown for the first unit using IP firewalling.)
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: Started podman-16916.scope.
Feb 16 12:29:18 np0005620856.novalocal systemd[4800]: Started podman-pause-9c15f064.scope.
Feb 16 12:29:20 np0005620856.novalocal sudo[17919]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqwlmwuhetrefxmcfiepjnwnbkwwivlv ; /usr/bin/python3'
Feb 16 12:29:20 np0005620856.novalocal sudo[17919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:20 np0005620856.novalocal python3[17925]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.82:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.82:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:29:20 np0005620856.novalocal python3[17925]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
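The `blockinfile` task above appends an insecure-registry entry to `/etc/containers/registries.conf` so podman/buildah will pull from the CI registry over plain HTTP. Reconstructed from the module arguments in the log, the managed block it writes would look like this (the `BEGIN`/`END` markers come from the task's `marker` settings):

```toml
# BEGIN ANSIBLE MANAGED BLOCK
[[registry]]
location = "38.102.83.82:5001"
insecure = true
# END ANSIBLE MANAGED BLOCK
```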
Feb 16 12:29:20 np0005620856.novalocal sudo[17919]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:20 np0005620856.novalocal sshd-session[8576]: Connection closed by 38.102.83.114 port 37690
Feb 16 12:29:20 np0005620856.novalocal sshd-session[8573]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:29:20 np0005620856.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 16 12:29:20 np0005620856.novalocal systemd[1]: session-5.scope: Consumed 40.712s CPU time.
Feb 16 12:29:20 np0005620856.novalocal systemd-logind[818]: Session 5 logged out. Waiting for processes to exit.
Feb 16 12:29:20 np0005620856.novalocal systemd-logind[818]: Removed session 5.
Feb 16 12:29:40 np0005620856.novalocal sshd-session[29073]: Connection closed by 38.102.83.173 port 48932 [preauth]
Feb 16 12:29:40 np0005620856.novalocal sshd-session[29079]: Connection closed by 38.102.83.173 port 48946 [preauth]
Feb 16 12:29:40 np0005620856.novalocal sshd-session[29074]: Unable to negotiate with 38.102.83.173 port 48970: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 16 12:29:40 np0005620856.novalocal sshd-session[29077]: Unable to negotiate with 38.102.83.173 port 48960: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 16 12:29:40 np0005620856.novalocal sshd-session[29076]: Unable to negotiate with 38.102.83.173 port 48964: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 16 12:29:42 np0005620856.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:29:42 np0005620856.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:29:42 np0005620856.novalocal systemd[1]: man-db-cache-update.service: Consumed 37.525s CPU time.
Feb 16 12:29:42 np0005620856.novalocal systemd[1]: run-r74eff0bf19ba4197b93fa2d5d1b923e6.service: Deactivated successfully.
Feb 16 12:29:43 np0005620856.novalocal sshd-session[30220]: Accepted publickey for zuul from 38.102.83.114 port 44266 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:29:44 np0005620856.novalocal systemd-logind[818]: New session 6 of user zuul.
Feb 16 12:29:44 np0005620856.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 16 12:29:44 np0005620856.novalocal sshd-session[30220]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:29:44 np0005620856.novalocal python3[30247]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:44 np0005620856.novalocal sudo[30271]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztkuppyhivztnfnmpioyvsjcrfwbloyr ; /usr/bin/python3'
Feb 16 12:29:44 np0005620856.novalocal sudo[30271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:44 np0005620856.novalocal python3[30273]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:44 np0005620856.novalocal sudo[30271]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:45 np0005620856.novalocal sudo[30297]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtiwrbcanlusgqxxmxiaohlkcwefkdnx ; /usr/bin/python3'
Feb 16 12:29:45 np0005620856.novalocal sudo[30297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:45 np0005620856.novalocal python3[30299]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005620856.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 16 12:29:45 np0005620856.novalocal useradd[30301]: new group: name=cloud-admin, GID=1002
Feb 16 12:29:45 np0005620856.novalocal useradd[30301]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 16 12:29:45 np0005620856.novalocal sudo[30297]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:45 np0005620856.novalocal sudo[30331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fadypogbiorjhpffvryyuwvzlhjtzpgf ; /usr/bin/python3'
Feb 16 12:29:45 np0005620856.novalocal sudo[30331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620856.novalocal python3[30333]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:46 np0005620856.novalocal sudo[30331]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:46 np0005620856.novalocal sudo[30409]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjiplabptsnquwldsfcztvpretwqgcuj ; /usr/bin/python3'
Feb 16 12:29:46 np0005620856.novalocal sudo[30409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620856.novalocal python3[30411]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:29:46 np0005620856.novalocal sudo[30409]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:46 np0005620856.novalocal sudo[30482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdbvadmlqgkkuewjemyzbwgqfwfgfvzm ; /usr/bin/python3'
Feb 16 12:29:46 np0005620856.novalocal sudo[30482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620856.novalocal python3[30484]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244986.1560895-151-190019472428370/source _original_basename=tmp0eldze6o follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:29:46 np0005620856.novalocal sudo[30482]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:47 np0005620856.novalocal sudo[30532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tklawkfnxatavllnvlthbsurmwsqbyei ; /usr/bin/python3'
Feb 16 12:29:47 np0005620856.novalocal sudo[30532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:47 np0005620856.novalocal python3[30534]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 16 12:29:47 np0005620856.novalocal systemd[1]: Starting Hostname Service...
Feb 16 12:29:47 np0005620856.novalocal systemd[1]: Started Hostname Service.
Feb 16 12:29:47 np0005620856.novalocal systemd-hostnamed[30538]: Changed pretty hostname to 'compute-0'
Feb 16 12:29:47 compute-0 systemd-hostnamed[30538]: Hostname set to <compute-0> (static)
Feb 16 12:29:47 compute-0 NetworkManager[7684]: <info>  [1771244987.6539] hostname: static hostname changed from "np0005620856.novalocal" to "compute-0"
Feb 16 12:29:47 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:29:47 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:29:47 compute-0 sudo[30532]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:48 compute-0 sshd-session[30223]: Connection closed by 38.102.83.114 port 44266
Feb 16 12:29:48 compute-0 sshd-session[30220]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:29:48 compute-0 systemd-logind[818]: Session 6 logged out. Waiting for processes to exit.
Feb 16 12:29:48 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 16 12:29:48 compute-0 systemd[1]: session-6.scope: Consumed 1.882s CPU time.
Feb 16 12:29:48 compute-0 systemd-logind[818]: Removed session 6.
Feb 16 12:29:57 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:30:17 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:33:20 compute-0 sshd-session[30558]: Accepted publickey for zuul from 38.102.83.173 port 43394 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:33:20 compute-0 systemd-logind[818]: New session 7 of user zuul.
Feb 16 12:33:20 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 16 12:33:20 compute-0 sshd-session[30558]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:33:20 compute-0 python3[30634]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:33:22 compute-0 sudo[30748]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctzxivjdjuugqlpubornwrzsrvkmyimv ; /usr/bin/python3'
Feb 16 12:33:22 compute-0 sudo[30748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:22 compute-0 python3[30750]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:22 compute-0 sudo[30748]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:22 compute-0 sudo[30821]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lknkonlyefzvwnukbfuxkcdiwrrfdnwz ; /usr/bin/python3'
Feb 16 12:33:22 compute-0 sudo[30821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:22 compute-0 python3[30823]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:22 compute-0 sudo[30821]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:22 compute-0 sudo[30847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwbvwiyllljbcmyicucyorjyaweewqlk ; /usr/bin/python3'
Feb 16 12:33:22 compute-0 sudo[30847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-0 python3[30849]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:23 compute-0 sudo[30847]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-0 sudo[30920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqsroftxfuklmwjdsjhnjqnfyvcnokl ; /usr/bin/python3'
Feb 16 12:33:23 compute-0 sudo[30920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-0 python3[30922]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:23 compute-0 sudo[30920]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-0 sudo[30946]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqlucldjvjndhblpdzdxogxxodljsdyq ; /usr/bin/python3'
Feb 16 12:33:23 compute-0 sudo[30946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-0 python3[30948]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:23 compute-0 sudo[30946]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-0 sudo[31019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazgkjptjqurgklsmtgstgybdlzzfkow ; /usr/bin/python3'
Feb 16 12:33:23 compute-0 sudo[31019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-0 python3[31021]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:23 compute-0 sudo[31019]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-0 sudo[31045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deyoybvplehwzftlhkqanqrctnfnhpyz ; /usr/bin/python3'
Feb 16 12:33:23 compute-0 sudo[31045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-0 python3[31047]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:24 compute-0 sudo[31045]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-0 sudo[31118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aggrnckghkhbgaofbwvabexnwzbizmay ; /usr/bin/python3'
Feb 16 12:33:24 compute-0 sudo[31118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-0 python3[31120]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:24 compute-0 sudo[31118]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-0 sudo[31144]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzbxxairveadnpluwgiuqapkopackpyo ; /usr/bin/python3'
Feb 16 12:33:24 compute-0 sudo[31144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-0 python3[31146]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:24 compute-0 sudo[31144]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-0 sudo[31217]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fctfhjhotxjlbzenubuseflztsgekkrt ; /usr/bin/python3'
Feb 16 12:33:24 compute-0 sudo[31217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-0 python3[31219]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:24 compute-0 sudo[31217]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-0 sudo[31243]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozpfgmkykwashpevnbtwimewneoelkct ; /usr/bin/python3'
Feb 16 12:33:24 compute-0 sudo[31243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-0 python3[31245]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:25 compute-0 sudo[31243]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-0 sudo[31316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orqbohjuzkvqqzrlorfobghtbitxtsog ; /usr/bin/python3'
Feb 16 12:33:25 compute-0 sudo[31316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-0 python3[31318]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:25 compute-0 sudo[31316]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-0 sudo[31342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpgqpjhwylfglgpfxhvugbzwbfyczncy ; /usr/bin/python3'
Feb 16 12:33:25 compute-0 sudo[31342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-0 python3[31344]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:25 compute-0 sudo[31342]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-0 sudo[31415]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptblakwhewlxivkrjhqhlfjasmpoefbn ; /usr/bin/python3'
Feb 16 12:33:25 compute-0 sudo[31415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-0 python3[31417]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.1246731-34521-74902007844707/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:25 compute-0 sudo[31415]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:27 compute-0 sshd-session[31442]: Connection closed by 192.168.122.11 port 54016 [preauth]
Feb 16 12:33:27 compute-0 sshd-session[31444]: Connection closed by 192.168.122.11 port 54030 [preauth]
Feb 16 12:33:27 compute-0 sshd-session[31445]: Unable to negotiate with 192.168.122.11 port 54042: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 16 12:33:27 compute-0 sshd-session[31446]: Unable to negotiate with 192.168.122.11 port 54050: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 16 12:33:28 compute-0 sshd-session[31443]: Unable to negotiate with 192.168.122.11 port 54066: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 16 12:36:39 compute-0 sshd-session[31454]: Connection closed by 170.64.172.236 port 53012
Feb 16 12:37:00 compute-0 sshd-session[31455]: Connection closed by authenticating user root 170.64.172.236 port 56138 [preauth]
Feb 16 12:38:19 compute-0 python3[31481]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:43:18 compute-0 sshd-session[30561]: Received disconnect from 38.102.83.173 port 43394:11: disconnected by user
Feb 16 12:43:18 compute-0 sshd-session[30561]: Disconnected from user zuul 38.102.83.173 port 43394
Feb 16 12:43:18 compute-0 sshd-session[30558]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:43:18 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 16 12:43:18 compute-0 systemd[1]: session-7.scope: Consumed 4.197s CPU time.
Feb 16 12:43:18 compute-0 systemd-logind[818]: Session 7 logged out. Waiting for processes to exit.
Feb 16 12:43:18 compute-0 systemd-logind[818]: Removed session 7.
Feb 16 12:52:45 compute-0 sshd-session[31488]: Connection closed by 103.213.244.180 port 38154 [preauth]
Feb 16 12:53:49 compute-0 sshd-session[31490]: Accepted publickey for zuul from 192.168.122.30 port 55774 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:53:49 compute-0 systemd-logind[818]: New session 8 of user zuul.
Feb 16 12:53:49 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 16 12:53:49 compute-0 sshd-session[31490]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:53:50 compute-0 python3.9[31643]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:53:51 compute-0 sudo[31822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbnfksbezzzzksmxhpmebqljhxoovxqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246431.285171-43-14050749758568/AnsiballZ_command.py'
Feb 16 12:53:51 compute-0 sudo[31822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:53:51 compute-0 python3.9[31824]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:53:58 compute-0 sudo[31822]: pam_unix(sudo:session): session closed for user root
Feb 16 12:53:59 compute-0 sshd-session[31493]: Connection closed by 192.168.122.30 port 55774
Feb 16 12:53:59 compute-0 sshd-session[31490]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:53:59 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 16 12:53:59 compute-0 systemd[1]: session-8.scope: Consumed 7.070s CPU time.
Feb 16 12:53:59 compute-0 systemd-logind[818]: Session 8 logged out. Waiting for processes to exit.
Feb 16 12:53:59 compute-0 systemd-logind[818]: Removed session 8.
Feb 16 12:54:05 compute-0 sshd-session[31881]: Accepted publickey for zuul from 192.168.122.30 port 56610 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:54:05 compute-0 systemd-logind[818]: New session 9 of user zuul.
Feb 16 12:54:05 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 16 12:54:05 compute-0 sshd-session[31881]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:54:07 compute-0 python3.9[32034]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:07 compute-0 sshd-session[31884]: Connection closed by 192.168.122.30 port 56610
Feb 16 12:54:07 compute-0 sshd-session[31881]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:54:07 compute-0 systemd-logind[818]: Session 9 logged out. Waiting for processes to exit.
Feb 16 12:54:07 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 16 12:54:07 compute-0 systemd-logind[818]: Removed session 9.
Feb 16 12:54:23 compute-0 sshd-session[32062]: Accepted publickey for zuul from 192.168.122.30 port 51920 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:54:23 compute-0 systemd-logind[818]: New session 10 of user zuul.
Feb 16 12:54:23 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 16 12:54:23 compute-0 sshd-session[32062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:54:25 compute-0 python3.9[32215]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 16 12:54:26 compute-0 python3.9[32389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:27 compute-0 sudo[32539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvabstdwasrxcodhhwrjkawaczxwynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246466.6974247-69-7419463175955/AnsiballZ_command.py'
Feb 16 12:54:27 compute-0 sudo[32539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:27 compute-0 python3.9[32541]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:54:27 compute-0 sudo[32539]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:28 compute-0 sudo[32692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chenrxwbgvprkyrsgoelflakomhoghph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246467.6096275-93-119609612041949/AnsiballZ_stat.py'
Feb 16 12:54:28 compute-0 sudo[32692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:28 compute-0 python3.9[32694]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:54:28 compute-0 sudo[32692]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:29 compute-0 sudo[32844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obccwpntgshpshjluyxnjoxvtooglxag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246468.6959264-109-194013399100064/AnsiballZ_file.py'
Feb 16 12:54:29 compute-0 sudo[32844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:29 compute-0 python3.9[32846]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:29 compute-0 sudo[32844]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:29 compute-0 sudo[32996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aohpggyfvgnsquoimbgydholsenohycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246469.6101565-125-192179736074488/AnsiballZ_stat.py'
Feb 16 12:54:29 compute-0 sudo[32996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:30 compute-0 python3.9[32998]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:54:30 compute-0 sudo[32996]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:30 compute-0 sudo[33119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhavirpzgrheexfmtpmdhedzxyqdozio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246469.6101565-125-192179736074488/AnsiballZ_copy.py'
Feb 16 12:54:30 compute-0 sudo[33119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:30 compute-0 python3.9[33121]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246469.6101565-125-192179736074488/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:30 compute-0 sudo[33119]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:31 compute-0 sudo[33271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrgjaeounhrznmwzbojlsdbwdnbfeoky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246470.999282-155-211573344498248/AnsiballZ_setup.py'
Feb 16 12:54:31 compute-0 sudo[33271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:31 compute-0 python3.9[33273]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:31 compute-0 sudo[33271]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:32 compute-0 sudo[33427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orxvgnfptdwzllconkngkurdjkyempol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246471.9649606-171-37716913584104/AnsiballZ_file.py'
Feb 16 12:54:32 compute-0 sudo[33427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:32 compute-0 python3.9[33429]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:54:32 compute-0 sudo[33427]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:33 compute-0 sudo[33579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfwjmvvdfcvmzofnfjvqhxbtszkdciwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246472.9679794-189-263589959801234/AnsiballZ_file.py'
Feb 16 12:54:33 compute-0 sudo[33579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:33 compute-0 python3.9[33581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:54:33 compute-0 sudo[33579]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:34 compute-0 python3.9[33731]: ansible-ansible.builtin.service_facts Invoked
Feb 16 12:54:38 compute-0 python3.9[33985]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:39 compute-0 python3.9[34135]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:40 compute-0 python3.9[34289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:41 compute-0 sudo[34445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvrzsdniujikqnluljhpwcrbymlylwoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246480.939033-285-280747797846087/AnsiballZ_setup.py'
Feb 16 12:54:41 compute-0 sudo[34445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:41 compute-0 python3.9[34447]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:54:41 compute-0 sudo[34445]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:42 compute-0 sudo[34529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkrifutlslpigpzkdcunriwcsxjcvvdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246480.939033-285-280747797846087/AnsiballZ_dnf.py'
Feb 16 12:54:42 compute-0 sudo[34529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:42 compute-0 python3.9[34531]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:55:24 compute-0 systemd[1]: Reloading.
Feb 16 12:55:24 compute-0 systemd-rc-local-generator[34724]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:24 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 16 12:55:25 compute-0 systemd[1]: Reloading.
Feb 16 12:55:25 compute-0 systemd-rc-local-generator[34773]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:25 compute-0 systemd[1]: Starting dnf makecache...
Feb 16 12:55:25 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 16 12:55:25 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 16 12:55:25 compute-0 systemd[1]: Reloading.
Feb 16 12:55:25 compute-0 systemd-rc-local-generator[34820]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:25 compute-0 dnf[34795]: Failed determining last makecache time.
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-barbican-42b4c41831408a8e323 132 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-python-glean-642fffe0203a8ffcc2443db52 189 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-cinder-1c00d6490d88e436f26ef 171 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-python-stevedore-c4acc5639fd2329372142 185 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-python-cloudkitty-tests-tempest-783703 180 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-diskimage-builder-61b717cc45660834fe9a 159 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-nova-eaa65f0b85123a4ee343246 172 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-python-designate-tests-tempest-347fdbc 168 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 12:55:25 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-glance-1fd12c29b339f30fe823e 173 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 171 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-manila-d783d10e75495b73866db 177 kB/s | 3.0 kB     00:00
Feb 16 12:55:25 compute-0 dnf[34795]: delorean-openstack-neutron-95cadbd379667c8520c8 127 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-openstack-octavia-5975097dd4b021385178 157 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-openstack-watcher-c014f81a8647287f6dcc 166 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-python-tcib-78032d201b02cee27e8e644c61 158 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 160 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-openstack-swift-dc98a8463506ac520c469a 156 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-python-tempestconf-8515371b7cceebd4282 200 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: delorean-openstack-heat-ui-013accbfd179753bc3f0 198 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: CentOS Stream 9 - BaseOS                         65 kB/s | 7.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: CentOS Stream 9 - AppStream                      30 kB/s | 7.1 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: CentOS Stream 9 - CRB                            67 kB/s | 6.9 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: CentOS Stream 9 - Extras packages                66 kB/s | 7.6 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: dlrn-antelope-testing                           174 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: dlrn-antelope-build-deps                        180 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: centos9-rabbitmq                                112 kB/s | 3.0 kB     00:00
Feb 16 12:55:26 compute-0 dnf[34795]: centos9-storage                                 124 kB/s | 3.0 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: centos9-opstools                                145 kB/s | 3.0 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: NFV SIG OpenvSwitch                             138 kB/s | 3.0 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: repo-setup-centos-appstream                     198 kB/s | 4.4 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: repo-setup-centos-baseos                         83 kB/s | 3.9 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: repo-setup-centos-highavailability              142 kB/s | 3.9 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: repo-setup-centos-powertools                    212 kB/s | 4.3 kB     00:00
Feb 16 12:55:27 compute-0 dnf[34795]: Extra Packages for Enterprise Linux 9 - x86_64  216 kB/s |  29 kB     00:00
Feb 16 12:55:28 compute-0 dnf[34795]: Metadata cache created.
Feb 16 12:55:28 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 16 12:55:28 compute-0 systemd[1]: Finished dnf makecache.
Feb 16 12:55:28 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.761s CPU time.
Feb 16 12:56:20 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:56:20 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:56:20 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 16 12:56:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:56:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:56:20 compute-0 systemd[1]: Reloading.
Feb 16 12:56:20 compute-0 systemd-rc-local-generator[35193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:56:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:56:21 compute-0 sudo[34529]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:56:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:56:21 compute-0 systemd[1]: run-rac75a1b168fc4acc9219f8b9686b4bb3.service: Deactivated successfully.
Feb 16 12:56:32 compute-0 sudo[36115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozeczsxfzdsqmetfghnwhtqpicwaovt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246592.6323104-309-111165002744178/AnsiballZ_command.py'
Feb 16 12:56:32 compute-0 sudo[36115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:33 compute-0 python3.9[36117]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:56:34 compute-0 sudo[36115]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:34 compute-0 sudo[36396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obghhlrnidjlhykoaycgztdofvppdnxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246594.2023098-325-266187781000672/AnsiballZ_selinux.py'
Feb 16 12:56:34 compute-0 sudo[36396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:35 compute-0 python3.9[36398]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 16 12:56:35 compute-0 sudo[36396]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:35 compute-0 sudo[36548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafjwnpgqtzzsnwlljfpqcnkjulaluyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246595.5042942-347-216356983613913/AnsiballZ_command.py'
Feb 16 12:56:35 compute-0 sudo[36548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:35 compute-0 python3.9[36550]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 16 12:56:36 compute-0 sudo[36548]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:37 compute-0 sudo[36701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yosypuqqdiltiifyloceekcqqunijdwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246597.5110664-363-245015875359115/AnsiballZ_file.py'
Feb 16 12:56:37 compute-0 sudo[36701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:39 compute-0 python3.9[36703]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:39 compute-0 sudo[36701]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:40 compute-0 sudo[36853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwrhewdwcumgjjsdwhfcserbhzonlznp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246600.0175738-379-96586124222120/AnsiballZ_mount.py'
Feb 16 12:56:40 compute-0 sudo[36853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:40 compute-0 python3.9[36855]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 16 12:56:40 compute-0 sudo[36853]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:41 compute-0 sudo[37005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugdrpwmufsrxhgsopffuyowjbcigwnrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246601.6321623-435-182824675978586/AnsiballZ_file.py'
Feb 16 12:56:41 compute-0 sudo[37005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:42 compute-0 python3.9[37007]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:42 compute-0 sudo[37005]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:42 compute-0 sudo[37157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujfdnflcwmyzvvjmpaaeuecbfgkingjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246602.3227048-451-54477349955825/AnsiballZ_stat.py'
Feb 16 12:56:42 compute-0 sudo[37157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:42 compute-0 python3.9[37159]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:56:42 compute-0 sudo[37157]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:43 compute-0 sudo[37280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzfmhazssrgxwvliiqmdkpjddnempudi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246602.3227048-451-54477349955825/AnsiballZ_copy.py'
Feb 16 12:56:43 compute-0 sudo[37280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:43 compute-0 python3.9[37282]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246602.3227048-451-54477349955825/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:43 compute-0 sudo[37280]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:46 compute-0 sudo[37432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfxhqvbskgvrvmpyccomdqwtirlqshyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246606.6008031-499-74546852306466/AnsiballZ_stat.py'
Feb 16 12:56:46 compute-0 sudo[37432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:47 compute-0 python3.9[37434]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:56:47 compute-0 sudo[37432]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:47 compute-0 sudo[37584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqbecswevsbrmiygbyomwlsdofcwjdsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246607.2670295-515-55641057755701/AnsiballZ_command.py'
Feb 16 12:56:47 compute-0 sudo[37584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:47 compute-0 python3.9[37586]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:56:47 compute-0 sudo[37584]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:48 compute-0 sudo[37737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfszmvuucfxpysjtzzqaioxpmuzugnbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246607.985988-531-128887757595040/AnsiballZ_file.py'
Feb 16 12:56:48 compute-0 sudo[37737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:48 compute-0 python3.9[37739]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:48 compute-0 sudo[37737]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:49 compute-0 sudo[37889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfqmprkrpsqubjfagzaoucklqamfevwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246608.8801801-553-221298190609154/AnsiballZ_getent.py'
Feb 16 12:56:49 compute-0 sudo[37889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:49 compute-0 python3.9[37891]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 16 12:56:49 compute-0 sudo[37889]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:49 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 12:56:50 compute-0 sudo[38043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyqunwfskcppsydnsxoawyeftcdplzmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246609.6585686-569-181699041309150/AnsiballZ_group.py'
Feb 16 12:56:50 compute-0 sudo[38043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:50 compute-0 python3.9[38045]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:56:50 compute-0 groupadd[38046]: group added to /etc/group: name=qemu, GID=107
Feb 16 12:56:50 compute-0 groupadd[38046]: group added to /etc/gshadow: name=qemu
Feb 16 12:56:50 compute-0 groupadd[38046]: new group: name=qemu, GID=107
Feb 16 12:56:50 compute-0 sudo[38043]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:50 compute-0 sudo[38202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ephxgscewhuyugmtghzdrawqmrrbwnjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246610.5010839-585-25786456693673/AnsiballZ_user.py'
Feb 16 12:56:50 compute-0 sudo[38202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:51 compute-0 python3.9[38204]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 12:56:51 compute-0 useradd[38206]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 12:56:51 compute-0 sudo[38202]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:51 compute-0 sudo[38362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgzscwqapxgoovygiwaviqggtitfkgxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246611.3952243-601-194920970579071/AnsiballZ_getent.py'
Feb 16 12:56:51 compute-0 sudo[38362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:51 compute-0 python3.9[38364]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 16 12:56:51 compute-0 sudo[38362]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:52 compute-0 sudo[38515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkcfxeszgjqjqtjizwuivgrpuyaqzwha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246612.018336-617-115373480442272/AnsiballZ_group.py'
Feb 16 12:56:52 compute-0 sudo[38515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:52 compute-0 python3.9[38517]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:56:52 compute-0 groupadd[38518]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 16 12:56:52 compute-0 groupadd[38518]: group added to /etc/gshadow: name=hugetlbfs
Feb 16 12:56:52 compute-0 groupadd[38518]: new group: name=hugetlbfs, GID=42477
Feb 16 12:56:52 compute-0 sudo[38515]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:53 compute-0 sudo[38673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grpuunawmslatraivwhnfebhmfqufrfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246612.7836714-635-57040620327949/AnsiballZ_file.py'
Feb 16 12:56:53 compute-0 sudo[38673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:53 compute-0 python3.9[38675]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 16 12:56:53 compute-0 sudo[38673]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:53 compute-0 sudo[38825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idfsjreagpxyaphyeexvcqzgevzirxof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246613.6265082-657-46317017360501/AnsiballZ_dnf.py'
Feb 16 12:56:53 compute-0 sudo[38825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:54 compute-0 python3.9[38827]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:56:56 compute-0 sudo[38825]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:56 compute-0 sudo[38979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dribtjanrexkeyftjmyjmcyheqjqykif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246616.3592064-673-120334065755558/AnsiballZ_file.py'
Feb 16 12:56:56 compute-0 sudo[38979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:56 compute-0 python3.9[38981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:56 compute-0 sudo[38979]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:57 compute-0 sudo[39131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwlnjljvsdweznguctdvvxxlnqxbetwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246617.158202-689-20541925107554/AnsiballZ_stat.py'
Feb 16 12:56:57 compute-0 sudo[39131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:57 compute-0 python3.9[39133]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:56:57 compute-0 sudo[39131]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:57 compute-0 sudo[39254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsnjjoavtygxjdhejshchjwyxhbjdunz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246617.158202-689-20541925107554/AnsiballZ_copy.py'
Feb 16 12:56:57 compute-0 sudo[39254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:58 compute-0 python3.9[39256]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246617.158202-689-20541925107554/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:58 compute-0 sudo[39254]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:59 compute-0 sudo[39406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugqdfesffzhfcwblrogmehwtgfvjlodj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246618.3435721-719-11567983857161/AnsiballZ_systemd.py'
Feb 16 12:56:59 compute-0 sudo[39406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:59 compute-0 python3.9[39408]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:56:59 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 12:56:59 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 16 12:56:59 compute-0 kernel: Bridge firewalling registered
Feb 16 12:56:59 compute-0 systemd-modules-load[39412]: Inserted module 'br_netfilter'
Feb 16 12:56:59 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 12:56:59 compute-0 sudo[39406]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:59 compute-0 sudo[39566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysgqfknkkptssdbpdvqmzggoherozrpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246619.6210835-735-200601555706526/AnsiballZ_stat.py'
Feb 16 12:56:59 compute-0 sudo[39566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:00 compute-0 python3.9[39568]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:00 compute-0 sudo[39566]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:00 compute-0 sudo[39689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpnykxbiochuifthxchxfrievzpaligt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246619.6210835-735-200601555706526/AnsiballZ_copy.py'
Feb 16 12:57:00 compute-0 sudo[39689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:00 compute-0 python3.9[39691]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246619.6210835-735-200601555706526/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:00 compute-0 sudo[39689]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:01 compute-0 sudo[39841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfllnqaqzelbvbgpqihehxtopzojiabd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246620.9730766-771-196144494293172/AnsiballZ_dnf.py'
Feb 16 12:57:01 compute-0 sudo[39841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:01 compute-0 python3.9[39843]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:57:05 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 12:57:05 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 12:57:06 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:57:06 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:57:06 compute-0 systemd[1]: Reloading.
Feb 16 12:57:06 compute-0 systemd-rc-local-generator[39898]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:57:06 compute-0 sudo[39841]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:09 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:57:09 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:57:09 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.863s CPU time.
Feb 16 12:57:09 compute-0 systemd[1]: run-rdb06f9b7d91f43beb15c2dcd05ce5298.service: Deactivated successfully.
Feb 16 12:57:10 compute-0 python3.9[43628]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:57:11 compute-0 python3.9[43780]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 16 12:57:11 compute-0 python3.9[43930]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:57:12 compute-0 sudo[44080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhtvgahjblydppazqnfjyderisgtenbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246632.4194698-849-5193553015351/AnsiballZ_command.py'
Feb 16 12:57:12 compute-0 sudo[44080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:12 compute-0 python3.9[44082]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:13 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 12:57:13 compute-0 systemd[1]: Starting Authorization Manager...
Feb 16 12:57:13 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 12:57:13 compute-0 polkitd[44299]: Started polkitd version 0.117
Feb 16 12:57:13 compute-0 polkitd[44299]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 12:57:13 compute-0 polkitd[44299]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 12:57:13 compute-0 polkitd[44299]: Finished loading, compiling and executing 2 rules
Feb 16 12:57:13 compute-0 systemd[1]: Started Authorization Manager.
Feb 16 12:57:13 compute-0 polkitd[44299]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 16 12:57:13 compute-0 sudo[44080]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:14 compute-0 sudo[44467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfnvzazdzkuiavyiyddnznkyaebdmau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246633.984533-867-227594344489312/AnsiballZ_systemd.py'
Feb 16 12:57:14 compute-0 sudo[44467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:14 compute-0 python3.9[44469]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:14 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 16 12:57:14 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 16 12:57:14 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 16 12:57:14 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 12:57:14 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 12:57:14 compute-0 sudo[44467]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:15 compute-0 python3.9[44630]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 16 12:57:18 compute-0 sudo[44780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwzxoaihxppyzpdharfnldyhqifigsyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246638.483824-981-42249003642746/AnsiballZ_systemd.py'
Feb 16 12:57:18 compute-0 sudo[44780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:19 compute-0 python3.9[44782]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:19 compute-0 systemd[1]: Reloading.
Feb 16 12:57:19 compute-0 systemd-rc-local-generator[44801]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:19 compute-0 sudo[44780]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:19 compute-0 sudo[44976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpxbdleaabsdhbktblordgqjdszrkoar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246639.406021-981-210398862841825/AnsiballZ_systemd.py'
Feb 16 12:57:19 compute-0 sudo[44976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:20 compute-0 python3.9[44978]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:20 compute-0 systemd[1]: Reloading.
Feb 16 12:57:20 compute-0 systemd-rc-local-generator[45003]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:20 compute-0 sudo[44976]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:21 compute-0 sudo[45172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuoptqtwmevwarhcbvksejtpksogxgdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246641.3393178-1013-11450045013493/AnsiballZ_command.py'
Feb 16 12:57:21 compute-0 sudo[45172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:21 compute-0 python3.9[45174]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:21 compute-0 sudo[45172]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:22 compute-0 sudo[45325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjgphhvitxqupcxibqpjyklfirholkec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246642.0384107-1029-254878283879924/AnsiballZ_command.py'
Feb 16 12:57:22 compute-0 sudo[45325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:22 compute-0 python3.9[45327]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:22 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 16 12:57:22 compute-0 sudo[45325]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:22 compute-0 sudo[45478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nktyynjwumiifqpwlrzsfpdxsuhcmsnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246642.7185402-1045-126802461512002/AnsiballZ_command.py'
Feb 16 12:57:22 compute-0 sudo[45478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:23 compute-0 python3.9[45480]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:24 compute-0 sudo[45478]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:25 compute-0 sudo[45640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-devjeczlbpymoiwrycbsocpmjoziiigq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246644.8899438-1061-26549379719641/AnsiballZ_command.py'
Feb 16 12:57:25 compute-0 sudo[45640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:25 compute-0 python3.9[45642]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:25 compute-0 sudo[45640]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:25 compute-0 sudo[45793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyijgenuupsafgzogczwsrgiramthyfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246645.5200531-1077-198340026567291/AnsiballZ_systemd.py'
Feb 16 12:57:25 compute-0 sudo[45793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:26 compute-0 python3.9[45795]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:57:26 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 12:57:26 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 16 12:57:26 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 16 12:57:26 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:57:26 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 16 12:57:26 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:57:26 compute-0 sudo[45793]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:26 compute-0 sshd-session[32065]: Connection closed by 192.168.122.30 port 51920
Feb 16 12:57:26 compute-0 sshd-session[32062]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:57:26 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 16 12:57:26 compute-0 systemd[1]: session-10.scope: Consumed 2min 4.296s CPU time.
Feb 16 12:57:26 compute-0 systemd-logind[818]: Session 10 logged out. Waiting for processes to exit.
Feb 16 12:57:26 compute-0 systemd-logind[818]: Removed session 10.
Feb 16 12:57:32 compute-0 sshd-session[45826]: Accepted publickey for zuul from 192.168.122.30 port 54562 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:57:32 compute-0 systemd-logind[818]: New session 11 of user zuul.
Feb 16 12:57:32 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 16 12:57:32 compute-0 sshd-session[45826]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:57:33 compute-0 python3.9[45979]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:34 compute-0 python3.9[46133]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:36 compute-0 sudo[46287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-garuzeugxormgjfxfjcnldjqixuysawi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246655.5736964-80-47218964132022/AnsiballZ_command.py'
Feb 16 12:57:36 compute-0 sudo[46287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:36 compute-0 python3.9[46289]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:36 compute-0 sudo[46287]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:37 compute-0 python3.9[46440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:38 compute-0 sudo[46594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yutsdsrmdqjiwqgcmvdsiijsetcujxhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246657.724014-120-172452172956883/AnsiballZ_setup.py'
Feb 16 12:57:38 compute-0 sudo[46594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:38 compute-0 python3.9[46596]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:57:38 compute-0 sudo[46594]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:38 compute-0 sudo[46678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zccplwpfsriwwhwxatdsxvbiinhpuumx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246657.724014-120-172452172956883/AnsiballZ_dnf.py'
Feb 16 12:57:38 compute-0 sudo[46678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:39 compute-0 python3.9[46680]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:57:40 compute-0 sudo[46678]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:41 compute-0 sudo[46831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhyushiuejlhooyehazagbdigketjvbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246660.9637356-144-97841243016743/AnsiballZ_setup.py'
Feb 16 12:57:41 compute-0 sudo[46831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:41 compute-0 python3.9[46833]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:57:41 compute-0 sudo[46831]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:42 compute-0 sudo[47002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbdhoytyhrokfggouqcbyenbwigfdbah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246661.9126716-166-69297825591223/AnsiballZ_file.py'
Feb 16 12:57:42 compute-0 sudo[47002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:42 compute-0 python3.9[47004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:57:42 compute-0 sudo[47002]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:43 compute-0 sudo[47154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuzlxjjeorrcibazlevqgxjefhxzermx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246662.760499-182-199494798496395/AnsiballZ_command.py'
Feb 16 12:57:43 compute-0 sudo[47154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:43 compute-0 python3.9[47156]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:43 compute-0 podman[47157]: 2026-02-16 12:57:43.301723144 +0000 UTC m=+0.054353703 system refresh
Feb 16 12:57:43 compute-0 sudo[47154]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:43 compute-0 sudo[47317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbvagyponvblpbjkjesjpiadccywobdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246663.5013723-198-145652760970249/AnsiballZ_stat.py'
Feb 16 12:57:43 compute-0 sudo[47317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:44 compute-0 python3.9[47319]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:44 compute-0 sudo[47317]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:57:44 compute-0 sudo[47440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrjjmhfwjgohwtvwbskhgrhzhsgshedn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246663.5013723-198-145652760970249/AnsiballZ_copy.py'
Feb 16 12:57:44 compute-0 sudo[47440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:45 compute-0 python3.9[47442]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246663.5013723-198-145652760970249/.source.json follow=False _original_basename=podman_network_config.j2 checksum=e5a60c77ec6118ba64c52ff5200610eba7fde00c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:57:45 compute-0 sudo[47440]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:45 compute-0 sudo[47592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pthjsszlpcnkspvzjekolvaycendgyip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246665.2493997-228-174077922939491/AnsiballZ_stat.py'
Feb 16 12:57:45 compute-0 sudo[47592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:45 compute-0 python3.9[47594]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:45 compute-0 sudo[47592]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:46 compute-0 sudo[47715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylodzbqzpdwuohovaipkfazliyfslipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246665.2493997-228-174077922939491/AnsiballZ_copy.py'
Feb 16 12:57:46 compute-0 sudo[47715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:46 compute-0 python3.9[47717]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246665.2493997-228-174077922939491/.source.conf follow=False _original_basename=registries.conf.j2 checksum=9b3bcfdba57b23b453cfef4c881e370a3c5d3bf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:46 compute-0 sudo[47715]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:46 compute-0 sudo[47867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhgpoprgnujxywpadyghryryplucpoik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246666.5011165-260-135897507845221/AnsiballZ_ini_file.py'
Feb 16 12:57:46 compute-0 sudo[47867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:47 compute-0 python3.9[47869]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:47 compute-0 sudo[47867]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:47 compute-0 sudo[48019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgxhybjmrfomfimypytzrzirdqhotdzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246667.1977956-260-270403937187404/AnsiballZ_ini_file.py'
Feb 16 12:57:47 compute-0 sudo[48019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:47 compute-0 python3.9[48021]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:47 compute-0 sudo[48019]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:48 compute-0 sudo[48171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfpaygujogaavepxlsvvyypyowplnlji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246667.787158-260-150048574841670/AnsiballZ_ini_file.py'
Feb 16 12:57:48 compute-0 sudo[48171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:48 compute-0 python3.9[48173]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:48 compute-0 sudo[48171]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:48 compute-0 sudo[48323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbyigimlihigzbegdrvsyhhzgzfcnwgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246668.4118962-260-222354985294816/AnsiballZ_ini_file.py'
Feb 16 12:57:48 compute-0 sudo[48323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:48 compute-0 python3.9[48325]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:48 compute-0 sudo[48323]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:49 compute-0 python3.9[48475]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:50 compute-0 sudo[48627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfyeaglcacvbjkqmcrwzxcxhrqyxfnqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246670.2106721-340-28618153494927/AnsiballZ_dnf.py'
Feb 16 12:57:50 compute-0 sudo[48627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:50 compute-0 python3.9[48629]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:51 compute-0 sudo[48627]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:52 compute-0 sudo[48780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqswrszzwgdefxhujctlkoqdjffliyqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246672.3376088-356-62766499767400/AnsiballZ_dnf.py'
Feb 16 12:57:52 compute-0 sudo[48780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:52 compute-0 python3.9[48782]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:55 compute-0 sudo[48780]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:55 compute-0 sudo[48940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyjvqynppyxkfcqlvdoiztqxflbuffsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246675.546652-376-187307291799898/AnsiballZ_dnf.py'
Feb 16 12:57:55 compute-0 sudo[48940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:55 compute-0 python3.9[48942]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:57 compute-0 sudo[48940]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:57 compute-0 sudo[49093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msirpcscvbinqtzprasfogpjhnjdkdkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246677.6256742-394-32960526227691/AnsiballZ_dnf.py'
Feb 16 12:57:57 compute-0 sudo[49093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:58 compute-0 python3.9[49095]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:59 compute-0 sudo[49093]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:00 compute-0 sudo[49246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unxzevfoluldirovsnzabgppbxyaedus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246679.753769-416-245268270116409/AnsiballZ_dnf.py'
Feb 16 12:58:00 compute-0 sudo[49246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:00 compute-0 python3.9[49248]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:01 compute-0 sudo[49246]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:02 compute-0 sudo[49402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxyfnzlglqznyqzmrzqesilsgxfmqxqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246682.0913875-432-13938743265804/AnsiballZ_dnf.py'
Feb 16 12:58:02 compute-0 sudo[49402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:02 compute-0 python3.9[49404]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:05 compute-0 sudo[49402]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:13 compute-0 sudo[49572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgvdwizlxivrrqwckxotikojrjkisazd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246693.2965248-450-85778917871214/AnsiballZ_dnf.py'
Feb 16 12:58:13 compute-0 sudo[49572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:13 compute-0 python3.9[49574]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:14 compute-0 sudo[49572]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:15 compute-0 sudo[49725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzzswmgnosswxoncpyxatffiliqurfgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246695.330417-468-19330246484380/AnsiballZ_dnf.py'
Feb 16 12:58:15 compute-0 sudo[49725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:15 compute-0 python3.9[49727]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:43 compute-0 sudo[49725]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:44 compute-0 sudo[50062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bblyklmknjtkuhrrmichmvwnkvywfoct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246724.4873703-486-257263349343448/AnsiballZ_dnf.py'
Feb 16 12:58:44 compute-0 sudo[50062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:44 compute-0 python3.9[50064]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:46 compute-0 sudo[50062]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:46 compute-0 sudo[50218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khasomiqbkjbrlufkemmgwqpfrzifpxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246726.557253-506-186548746726172/AnsiballZ_dnf.py'
Feb 16 12:58:46 compute-0 sudo[50218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:47 compute-0 python3.9[50220]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:48 compute-0 sudo[50218]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:49 compute-0 sudo[50375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilbzveyfvnmqglkxgduvdvkpscunwvqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.355433-528-230089352851344/AnsiballZ_file.py'
Feb 16 12:58:49 compute-0 sudo[50375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:49 compute-0 python3.9[50377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:58:49 compute-0 sudo[50375]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:50 compute-0 sudo[50550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enmctscjpomuwrqvpexthgjcjmilqrcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.9872532-544-175930099035496/AnsiballZ_stat.py'
Feb 16 12:58:50 compute-0 sudo[50550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:50 compute-0 python3.9[50552]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:58:50 compute-0 sudo[50550]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:50 compute-0 sudo[50673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcquqbhxbewizfrjyhvyvnqtpuvqqab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.9872532-544-175930099035496/AnsiballZ_copy.py'
Feb 16 12:58:50 compute-0 sudo[50673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:51 compute-0 python3.9[50675]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771246729.9872532-544-175930099035496/.source.json _original_basename=.ra5o769a follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:58:51 compute-0 sudo[50673]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:51 compute-0 sudo[50825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtwysoeyjjcnryyckacxjbvrtujhcud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246731.3284564-580-259549342133108/AnsiballZ_podman_image.py'
Feb 16 12:58:51 compute-0 sudo[50825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:52 compute-0 python3.9[50827]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:58:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat703426854-lower\x2dmapped.mount: Deactivated successfully.
Feb 16 12:58:56 compute-0 podman[50839]: 2026-02-16 12:58:56.644573377 +0000 UTC m=+4.555258974 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 12:58:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:56 compute-0 sudo[50825]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:58 compute-0 sudo[51138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejywpsnlwjkbibnedjjwdjndesoytkwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246738.101452-602-105590509309235/AnsiballZ_podman_image.py'
Feb 16 12:58:58 compute-0 sudo[51138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:58 compute-0 python3.9[51140]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:58:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:06 compute-0 podman[51152]: 2026-02-16 12:59:06.455188756 +0000 UTC m=+7.857765407 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 12:59:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:06 compute-0 sudo[51138]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:09 compute-0 sudo[51452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fckdorfqezewubulhwjwxjhorwkdrsxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246748.9331024-622-249968633910397/AnsiballZ_podman_image.py'
Feb 16 12:59:09 compute-0 sudo[51452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:09 compute-0 python3.9[51454]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:14 compute-0 irqbalance[816]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 16 12:59:14 compute-0 irqbalance[816]: IRQ 27 affinity is now unmanaged
Feb 16 12:59:18 compute-0 podman[51466]: 2026-02-16 12:59:18.51863295 +0000 UTC m=+9.022157050 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 12:59:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:18 compute-0 sudo[51452]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:24 compute-0 sudo[51720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pecbyclxvzbjoqjxxvacnrlqnvamxjqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246764.4549186-644-190138389735108/AnsiballZ_podman_image.py'
Feb 16 12:59:24 compute-0 sudo[51720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:25 compute-0 python3.9[51722]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:27 compute-0 podman[51734]: 2026-02-16 12:59:27.311560087 +0000 UTC m=+2.251402790 image pull be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 16 12:59:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:27 compute-0 sudo[51720]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:27 compute-0 sudo[51991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puidbmfitbmndqaiowpcwiufqabsstef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246767.6682382-644-102826252816117/AnsiballZ_podman_image.py'
Feb 16 12:59:27 compute-0 sudo[51991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:28 compute-0 python3.9[51993]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:29 compute-0 podman[52006]: 2026-02-16 12:59:29.382649621 +0000 UTC m=+1.172170297 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 16 12:59:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:29 compute-0 sudo[51991]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:33 compute-0 sshd-session[45829]: Connection closed by 192.168.122.30 port 54562
Feb 16 12:59:33 compute-0 sshd-session[45826]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:59:33 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 16 12:59:33 compute-0 systemd[1]: session-11.scope: Consumed 1min 33.786s CPU time.
Feb 16 12:59:33 compute-0 systemd-logind[818]: Session 11 logged out. Waiting for processes to exit.
Feb 16 12:59:33 compute-0 systemd-logind[818]: Removed session 11.
Feb 16 12:59:41 compute-0 sshd-session[52154]: Accepted publickey for zuul from 192.168.122.30 port 56532 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:59:41 compute-0 systemd-logind[818]: New session 12 of user zuul.
Feb 16 12:59:41 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 16 12:59:41 compute-0 sshd-session[52154]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:59:42 compute-0 python3.9[52307]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:59:44 compute-0 sudo[52461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbnhhtstpygphuvrfnshwqntstqofqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246783.6426232-54-198075047668270/AnsiballZ_getent.py'
Feb 16 12:59:44 compute-0 sudo[52461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:44 compute-0 python3.9[52463]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 16 12:59:44 compute-0 sudo[52461]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:44 compute-0 sudo[52614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atkxxgvheradqmdittabazbiakyaaagp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246784.5038042-70-1069347140139/AnsiballZ_group.py'
Feb 16 12:59:44 compute-0 sudo[52614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:45 compute-0 python3.9[52616]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:59:45 compute-0 groupadd[52617]: group added to /etc/group: name=openvswitch, GID=42476
Feb 16 12:59:45 compute-0 groupadd[52617]: group added to /etc/gshadow: name=openvswitch
Feb 16 12:59:45 compute-0 groupadd[52617]: new group: name=openvswitch, GID=42476
Feb 16 12:59:45 compute-0 sudo[52614]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:45 compute-0 sudo[52772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bniwgewqucqtataxbxetgavpstugxeas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246785.4967873-86-74300359790999/AnsiballZ_user.py'
Feb 16 12:59:45 compute-0 sudo[52772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:46 compute-0 python3.9[52774]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 12:59:46 compute-0 useradd[52776]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 12:59:46 compute-0 useradd[52776]: add 'openvswitch' to group 'hugetlbfs'
Feb 16 12:59:46 compute-0 useradd[52776]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 16 12:59:46 compute-0 sudo[52772]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:47 compute-0 sudo[52932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysuyafinkjfodhhzqhurrausarxvvvfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246786.9054575-106-211431869463229/AnsiballZ_setup.py'
Feb 16 12:59:47 compute-0 sudo[52932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:47 compute-0 python3.9[52934]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:59:47 compute-0 sudo[52932]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:48 compute-0 sudo[53016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdtyxequuwjdiecthrwkmtrnlfmjmse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246786.9054575-106-211431869463229/AnsiballZ_dnf.py'
Feb 16 12:59:48 compute-0 sudo[53016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:49 compute-0 python3.9[53018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:59:49 compute-0 sshd-session[53019]: Connection closed by 146.190.226.24 port 39620
Feb 16 12:59:51 compute-0 sudo[53016]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:51 compute-0 sudo[53178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrswqsvryqejhlhlmijkiidqenxharbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246791.4058125-134-95132509370041/AnsiballZ_dnf.py'
Feb 16 12:59:51 compute-0 sudo[53178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:51 compute-0 python3.9[53180]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:02 compute-0 kernel: SELinux:  Converting 2740 SID table entries...
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:00:02 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:00:02 compute-0 groupadd[53203]: group added to /etc/group: name=unbound, GID=994
Feb 16 13:00:02 compute-0 groupadd[53203]: group added to /etc/gshadow: name=unbound
Feb 16 13:00:02 compute-0 groupadd[53203]: new group: name=unbound, GID=994
Feb 16 13:00:02 compute-0 useradd[53210]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 16 13:00:02 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 16 13:00:02 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 16 13:00:03 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:03 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:03 compute-0 systemd[1]: Reloading.
Feb 16 13:00:03 compute-0 systemd-sysv-generator[53713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:03 compute-0 systemd-rc-local-generator[53709]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:04 compute-0 sudo[53178]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:04 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:04 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:04 compute-0 systemd[1]: run-r12be192cf3ed4d5582540d1aec418a71.service: Deactivated successfully.
Feb 16 13:00:07 compute-0 sudo[54287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzzzggpjdrwylgplbwvuwkdqptqjqifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246806.670589-150-39990971888874/AnsiballZ_systemd.py'
Feb 16 13:00:07 compute-0 sudo[54287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:07 compute-0 python3.9[54289]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:00:07 compute-0 systemd[1]: Reloading.
Feb 16 13:00:07 compute-0 systemd-sysv-generator[54314]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:07 compute-0 systemd-rc-local-generator[54309]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:07 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 16 13:00:07 compute-0 chown[54338]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 16 13:00:07 compute-0 ovs-ctl[54343]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 16 13:00:07 compute-0 ovs-ctl[54343]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 16 13:00:07 compute-0 ovs-ctl[54343]: Starting ovsdb-server [  OK  ]
Feb 16 13:00:07 compute-0 ovs-vsctl[54392]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 16 13:00:08 compute-0 ovs-vsctl[54408]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"b0e583b2-47d7-4bde-bbd6-282143e0c194\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 16 13:00:08 compute-0 ovs-ctl[54343]: Configuring Open vSwitch system IDs [  OK  ]
Feb 16 13:00:08 compute-0 ovs-ctl[54343]: Enabling remote OVSDB managers [  OK  ]
Feb 16 13:00:08 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 16 13:00:08 compute-0 ovs-vsctl[54418]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 16 13:00:08 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 16 13:00:08 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 16 13:00:08 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 16 13:00:08 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 16 13:00:08 compute-0 ovs-ctl[54463]: Inserting openvswitch module [  OK  ]
Feb 16 13:00:08 compute-0 ovs-ctl[54432]: Starting ovs-vswitchd [  OK  ]
Feb 16 13:00:08 compute-0 ovs-vsctl[54480]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 16 13:00:08 compute-0 ovs-ctl[54432]: Enabling remote OVSDB managers [  OK  ]
Feb 16 13:00:08 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 16 13:00:08 compute-0 systemd[1]: Starting Open vSwitch...
Feb 16 13:00:08 compute-0 systemd[1]: Finished Open vSwitch.
Feb 16 13:00:08 compute-0 sudo[54287]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:09 compute-0 python3.9[54632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:00:09 compute-0 sudo[54782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cohdjgrkmgewyhnflwwlyawebaqphelu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246809.4857826-188-266163342007408/AnsiballZ_sefcontext.py'
Feb 16 13:00:09 compute-0 sudo[54782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:10 compute-0 python3.9[54784]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 16 13:00:11 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:00:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:00:11 compute-0 sudo[54782]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:12 compute-0 python3.9[54939]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:00:12 compute-0 sudo[55095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbtsuviougahvitqqeqigmfjkgnvvggg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246812.5369582-224-259862701527555/AnsiballZ_dnf.py'
Feb 16 13:00:12 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 16 13:00:12 compute-0 sudo[55095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:13 compute-0 python3.9[55097]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:14 compute-0 sudo[55095]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:15 compute-0 sudo[55248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oujbzijnohylhtfjqbzobahvtsuequaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246814.8544056-240-93493558881582/AnsiballZ_command.py'
Feb 16 13:00:15 compute-0 sudo[55248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:15 compute-0 python3.9[55250]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:00:16 compute-0 sudo[55248]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:16 compute-0 sudo[55535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzuluhfncyhbudhzknxksvnigkektcwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246816.3717232-256-49946369973789/AnsiballZ_file.py'
Feb 16 13:00:16 compute-0 sudo[55535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:17 compute-0 python3.9[55537]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 16 13:00:17 compute-0 sudo[55535]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:17 compute-0 python3.9[55687]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:00:18 compute-0 sudo[55839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aefecwokxkfwivywajwqkbbooxiypyrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246818.126307-288-8923587692428/AnsiballZ_dnf.py'
Feb 16 13:00:18 compute-0 sudo[55839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:18 compute-0 python3.9[55841]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:20 compute-0 systemd[1]: Reloading.
Feb 16 13:00:20 compute-0 systemd-sysv-generator[55879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:20 compute-0 systemd-rc-local-generator[55876]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:20 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:20 compute-0 systemd[1]: run-r69a361a329054bdda33027c02aa0dc65.service: Deactivated successfully.
Feb 16 13:00:20 compute-0 sudo[55839]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:21 compute-0 sudo[56162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klffdrxlcywmufsbfxntzkrcqdwepyeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246821.4543233-304-267156219844250/AnsiballZ_systemd.py'
Feb 16 13:00:21 compute-0 sudo[56162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:22 compute-0 python3.9[56164]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:00:22 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 13:00:22 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 16 13:00:22 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 16 13:00:22 compute-0 systemd[1]: Stopping Network Manager...
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0344] caught SIGTERM, shutting down normally.
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0365] dhcp4 (eth0): canceled DHCP transaction
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0366] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0366] dhcp4 (eth0): state changed no lease
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0369] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 13:00:22 compute-0 NetworkManager[7684]: <info>  [1771246822.0433] exiting (success)
Feb 16 13:00:22 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 13:00:22 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 13:00:22 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 13:00:22 compute-0 systemd[1]: Stopped Network Manager.
Feb 16 13:00:22 compute-0 systemd[1]: NetworkManager.service: Consumed 18.554s CPU time, 4.1M memory peak, read 0B from disk, written 30.5K to disk.
Feb 16 13:00:22 compute-0 systemd[1]: Starting Network Manager...
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1086] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:8a9ec7d7-dead-4443-8176-f7a3c4743d84)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1089] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1137] manager[0x563e7c2f2000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 13:00:22 compute-0 systemd[1]: Starting Hostname Service...
Feb 16 13:00:22 compute-0 systemd[1]: Started Hostname Service.
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1845] hostname: hostname: using hostnamed
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1846] hostname: static hostname changed from (none) to "compute-0"
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1852] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1859] manager[0x563e7c2f2000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1860] manager[0x563e7c2f2000]: rfkill: WWAN hardware radio set enabled
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1881] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1890] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1891] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1891] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1892] manager: Networking is enabled by state file
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1894] settings: Loaded settings plugin: keyfile (internal)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1897] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1921] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1930] dhcp: init: Using DHCP client 'internal'
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1933] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1939] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1944] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1951] device (lo): Activation: starting connection 'lo' (e66ad5fa-4651-424c-a7b4-2a119df1e243)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1957] device (eth0): carrier: link connected
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1961] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1966] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1967] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1973] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1981] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1988] device (eth1): carrier: link connected
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1992] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1997] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (5342df4c-d20b-514f-b7a0-cf6ea02e3054) (indicated)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.1998] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2002] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2012] device (eth1): Activation: starting connection 'ci-private-network' (5342df4c-d20b-514f-b7a0-cf6ea02e3054)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2020] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 13:00:22 compute-0 systemd[1]: Started Network Manager.
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2029] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2031] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2036] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2038] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2042] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2046] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2049] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2066] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2081] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2088] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2099] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2120] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2136] dhcp4 (eth0): state changed new lease, address=38.102.83.210
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2145] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 13:00:22 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2223] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2233] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2235] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2238] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2247] device (lo): Activation: successful, device activated.
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2256] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2261] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2267] device (eth1): Activation: successful, device activated.
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2281] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2284] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2290] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2296] device (eth0): Activation: successful, device activated.
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2304] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 13:00:22 compute-0 NetworkManager[56177]: <info>  [1771246822.2309] manager: startup complete
Feb 16 13:00:22 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 16 13:00:22 compute-0 sudo[56162]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:22 compute-0 sudo[56389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llyksdraqcvpluodeubnudmwimigygse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246822.4111755-320-228501923931014/AnsiballZ_dnf.py'
Feb 16 13:00:22 compute-0 sudo[56389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:22 compute-0 python3.9[56391]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:27 compute-0 systemd[1]: Reloading.
Feb 16 13:00:27 compute-0 systemd-rc-local-generator[56440]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:27 compute-0 systemd-sysv-generator[56445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:28 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:28 compute-0 systemd[1]: run-r4dfc6f0a9d3f4c5484242f06916d06dc.service: Deactivated successfully.
Feb 16 13:00:28 compute-0 sudo[56389]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:32 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 13:00:33 compute-0 sudo[56861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzkzguydzedbhkfwbkqguvvswmdvwere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246833.586946-344-189129275082458/AnsiballZ_stat.py'
Feb 16 13:00:33 compute-0 sudo[56861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:34 compute-0 python3.9[56863]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:00:34 compute-0 sudo[56861]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:34 compute-0 sudo[57013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eciogwxlxcrbmldiqbljegghibeklqpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246834.252263-362-82421786177284/AnsiballZ_ini_file.py'
Feb 16 13:00:34 compute-0 sudo[57013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:34 compute-0 python3.9[57015]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:34 compute-0 sudo[57013]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:35 compute-0 sudo[57167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljcpbsanlwvtsztygykgpcbmfldfvhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246835.135567-382-56366759518897/AnsiballZ_ini_file.py'
Feb 16 13:00:35 compute-0 sudo[57167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:35 compute-0 python3.9[57169]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:35 compute-0 sudo[57167]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:36 compute-0 sudo[57319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqgziwrsqxofvcilczdtgjfwpfuexauk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246835.7962754-382-37574902299291/AnsiballZ_ini_file.py'
Feb 16 13:00:36 compute-0 sudo[57319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:36 compute-0 python3.9[57321]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:36 compute-0 sudo[57319]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:36 compute-0 sudo[57471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwuuvxgvxrsotwraujxorgfwkidphikr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246836.5968513-412-153641970412887/AnsiballZ_ini_file.py'
Feb 16 13:00:36 compute-0 sudo[57471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:37 compute-0 python3.9[57473]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:37 compute-0 sudo[57471]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:37 compute-0 sudo[57623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sevvzobzwuismwjxxwrdlhwapxvchday ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.177205-412-262138789153120/AnsiballZ_ini_file.py'
Feb 16 13:00:37 compute-0 sudo[57623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:37 compute-0 python3.9[57625]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:37 compute-0 sudo[57623]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:38 compute-0 sudo[57775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttwtduqnebgwgovosplqvguvlpbxbdzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.7741673-442-220046919077587/AnsiballZ_stat.py'
Feb 16 13:00:38 compute-0 sudo[57775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:38 compute-0 python3.9[57777]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:38 compute-0 sudo[57775]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:38 compute-0 sudo[57898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbgkkllpmlciqbpopfscvrxiwwdipwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.7741673-442-220046919077587/AnsiballZ_copy.py'
Feb 16 13:00:38 compute-0 sudo[57898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:38 compute-0 python3.9[57900]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246837.7741673-442-220046919077587/.source _original_basename=.0f9kfjr5 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:38 compute-0 sudo[57898]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:39 compute-0 sudo[58050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxwmvzajexygyviytdmilldqqavfmeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246838.97174-472-251823410326012/AnsiballZ_file.py'
Feb 16 13:00:39 compute-0 sudo[58050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:39 compute-0 python3.9[58052]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:39 compute-0 sudo[58050]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:40 compute-0 sudo[58202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hebgrrusyeomaqetylsvudclwrbpaxea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246839.6370792-488-145705024631478/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 16 13:00:40 compute-0 sudo[58202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:40 compute-0 python3.9[58204]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 16 13:00:40 compute-0 sudo[58202]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:40 compute-0 sudo[58354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilolktlyzzvcdivkktsnzofdtheawxqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246840.4744635-506-165491555359990/AnsiballZ_file.py'
Feb 16 13:00:40 compute-0 sudo[58354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:40 compute-0 python3.9[58356]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:40 compute-0 sudo[58354]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:41 compute-0 sudo[58506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmzgnjvwzeqsckknzbymalneqjlrzvie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246841.225802-526-274179576236717/AnsiballZ_stat.py'
Feb 16 13:00:41 compute-0 sudo[58506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:41 compute-0 sudo[58506]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:42 compute-0 sudo[58629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayhlwxpoirmcyfmyhnsasnlgirmqrncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246841.225802-526-274179576236717/AnsiballZ_copy.py'
Feb 16 13:00:42 compute-0 sudo[58629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:42 compute-0 sudo[58629]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:42 compute-0 sudo[58781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghmrmkdugosylaxeoxmivsewgnsyzhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246842.4017942-556-255502346878812/AnsiballZ_slurp.py'
Feb 16 13:00:42 compute-0 sudo[58781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:42 compute-0 python3.9[58783]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 16 13:00:43 compute-0 sudo[58781]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:43 compute-0 sudo[58956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqotlycfubqtifdslhpziyqmivpfeiql ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.2572596-574-214014839718050/async_wrapper.py j805786542711 300 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.2572596-574-214014839718050/AnsiballZ_edpm_os_net_config.py _'
Feb 16 13:00:43 compute-0 sudo[58956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:44 compute-0 ansible-async_wrapper.py[58958]: Invoked with j805786542711 300 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.2572596-574-214014839718050/AnsiballZ_edpm_os_net_config.py _
Feb 16 13:00:44 compute-0 ansible-async_wrapper.py[58961]: Starting module and watcher
Feb 16 13:00:44 compute-0 ansible-async_wrapper.py[58961]: Start watching 58962 (300)
Feb 16 13:00:44 compute-0 ansible-async_wrapper.py[58962]: Start module (58962)
Feb 16 13:00:44 compute-0 ansible-async_wrapper.py[58958]: Return async_wrapper task started.
Feb 16 13:00:44 compute-0 sudo[58956]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:44 compute-0 python3.9[58963]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 16 13:00:44 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 16 13:00:44 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 16 13:00:44 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 16 13:00:44 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 16 13:00:44 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.8848] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.8869] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9453] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9454] audit: op="connection-add" uuid="6aad6b43-ea5c-4d5d-a4a2-9c47926a5230" name="br-ex-br" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9466] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9467] audit: op="connection-add" uuid="774366e0-f3d0-439c-a936-c5208547c2a9" name="br-ex-port" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9476] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9476] audit: op="connection-add" uuid="a3b4a1be-db0b-4e79-8219-7bb6742fe7fe" name="eth1-port" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9486] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9487] audit: op="connection-add" uuid="8d8b9a63-ede1-4e7a-a597-cf6d31864e31" name="vlan20-port" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9498] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9500] audit: op="connection-add" uuid="c1ed7cef-ae09-4f4f-a299-d481de99fd09" name="vlan21-port" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9509] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9510] audit: op="connection-add" uuid="25c80987-1304-4178-ad32-2100911a7fd9" name="vlan22-port" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9526] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,802-3-ethernet.mtu" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9540] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9542] audit: op="connection-add" uuid="3740008e-3c44-41b4-86a2-3f30fcf7aac5" name="br-ex-if" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9657] audit: op="connection-update" uuid="5342df4c-d20b-514f-b7a0-cf6ea02e3054" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.controller,connection.master,connection.timestamp,ipv4.routes,ipv4.never-default,ipv4.method,ipv4.routing-rules,ipv4.addresses,ipv4.dns,ipv6.routes,ipv6.addr-gen-mode,ipv6.method,ipv6.routing-rules,ipv6.addresses,ipv6.dns,ovs-interface.type,ovs-external-ids.data" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9677] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9679] audit: op="connection-add" uuid="068844c1-3394-421f-8f37-7b6614935cb1" name="vlan20-if" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9695] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9697] audit: op="connection-add" uuid="0adf48ee-32be-4b2a-aa2e-1b7bb6a38696" name="vlan21-if" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9714] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9715] audit: op="connection-add" uuid="7d7518a7-efd6-45c6-bd42-4214f659a57a" name="vlan22-if" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9727] audit: op="connection-delete" uuid="1c93776b-06f5-3002-90c0-48f1720b7a9b" name="Wired connection 1" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9738] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9740] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9746] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9750] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6aad6b43-ea5c-4d5d-a4a2-9c47926a5230)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9751] audit: op="connection-activate" uuid="6aad6b43-ea5c-4d5d-a4a2-9c47926a5230" name="br-ex-br" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9752] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9753] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9758] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9762] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (774366e0-f3d0-439c-a936-c5208547c2a9)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9764] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9764] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9768] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9772] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (a3b4a1be-db0b-4e79-8219-7bb6742fe7fe)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9773] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9774] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9779] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9783] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (8d8b9a63-ede1-4e7a-a597-cf6d31864e31)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9784] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9785] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9790] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9794] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c1ed7cef-ae09-4f4f-a299-d481de99fd09)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9795] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9796] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9801] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9805] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (25c80987-1304-4178-ad32-2100911a7fd9)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9805] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9807] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9809] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9815] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9816] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9818] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9822] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (3740008e-3c44-41b4-86a2-3f30fcf7aac5)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9822] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9825] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9827] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9828] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9829] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9839] device (eth1): disconnecting for new activation request.
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9839] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9841] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9843] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9844] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9846] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9847] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9850] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9853] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (068844c1-3394-421f-8f37-7b6614935cb1)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9854] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9856] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9858] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9859] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9861] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9862] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9864] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9868] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (0adf48ee-32be-4b2a-aa2e-1b7bb6a38696)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9869] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9871] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9873] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9874] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9877] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <warn>  [1771246845.9877] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9880] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9884] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (7d7518a7-efd6-45c6-bd42-4214f659a57a)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9884] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9887] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9889] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9890] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9891] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9906] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu" pid=58964 uid=0 result="success"
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9907] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9910] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9911] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9918] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9921] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9925] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9928] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9930] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9936] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9942] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9947] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9950] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9957] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 kernel: Timeout policy base is empty
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9963] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 systemd-udevd[58968]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9968] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9970] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9976] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9981] dhcp4 (eth0): canceled DHCP transaction
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9981] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9981] dhcp4 (eth0): state changed no lease
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9982] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9992] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 16 13:00:45 compute-0 NetworkManager[56177]: <info>  [1771246845.9996] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58964 uid=0 result="fail" reason="Device is not activated"
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0002] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 16 13:00:46 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0033] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0037] dhcp4 (eth0): state changed new lease, address=38.102.83.210
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0041] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0075] device (eth1): disconnecting for new activation request.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0076] audit: op="connection-activate" uuid="5342df4c-d20b-514f-b7a0-cf6ea02e3054" name="ci-private-network" pid=58964 uid=0 result="success"
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0100] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58964 uid=0 result="success"
Feb 16 13:00:46 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0102] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0184] device (eth1): Activation: starting connection 'ci-private-network' (5342df4c-d20b-514f-b7a0-cf6ea02e3054)
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0199] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0203] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0210] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0211] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0213] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0214] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0216] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0218] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0230] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0236] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0240] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0246] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0251] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 kernel: br-ex: entered promiscuous mode
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0255] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0258] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0261] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0265] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0270] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0274] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0278] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0281] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0287] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0292] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0308] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0310] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0315] device (eth1): Activation: successful, device activated.
Feb 16 13:00:46 compute-0 kernel: vlan22: entered promiscuous mode
Feb 16 13:00:46 compute-0 systemd-udevd[58969]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:46 compute-0 systemd-udevd[58970]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:46 compute-0 kernel: vlan21: entered promiscuous mode
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0400] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-0 kernel: vlan20: entered promiscuous mode
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0448] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0451] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0470] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0488] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0492] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0497] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0501] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0507] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0513] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0517] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0523] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0566] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0567] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0572] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0604] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0614] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0630] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0634] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-0 NetworkManager[56177]: <info>  [1771246846.0638] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.1589] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.3366] checkpoint[0x563e7c2c6950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.3370] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.5839] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.5850] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.7503] audit: op="networking-control" arg="global-dns-configuration" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.7535] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.7587] audit: op="networking-control" arg="global-dns-configuration" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.8091] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58964 uid=0 result="success"
Feb 16 13:00:47 compute-0 sudo[59301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsrnlshaiqvgmyybwmyeqyhjsweznkwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246847.2847855-574-56899373368951/AnsiballZ_async_status.py'
Feb 16 13:00:47 compute-0 sudo[59301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.9591] checkpoint[0x563e7c2c6a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 16 13:00:47 compute-0 NetworkManager[56177]: <info>  [1771246847.9598] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58964 uid=0 result="success"
Feb 16 13:00:48 compute-0 ansible-async_wrapper.py[58962]: Module complete (58962)
Feb 16 13:00:48 compute-0 python3.9[59303]: ansible-ansible.legacy.async_status Invoked with jid=j805786542711.58958 mode=status _async_dir=/root/.ansible_async
Feb 16 13:00:48 compute-0 sudo[59301]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:48 compute-0 sudo[59401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgwlgcragcgrqqfjatdygfkebhzcuiyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246847.2847855-574-56899373368951/AnsiballZ_async_status.py'
Feb 16 13:00:48 compute-0 sudo[59401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:48 compute-0 python3.9[59403]: ansible-ansible.legacy.async_status Invoked with jid=j805786542711.58958 mode=cleanup _async_dir=/root/.ansible_async
Feb 16 13:00:48 compute-0 sudo[59401]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:49 compute-0 ansible-async_wrapper.py[58961]: Done in kid B.
Feb 16 13:00:49 compute-0 sudo[59553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkwlnpnskdzzcwszusqdzmvkkuevkrqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246848.8836682-618-276127241867185/AnsiballZ_stat.py'
Feb 16 13:00:49 compute-0 sudo[59553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:49 compute-0 python3.9[59555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:49 compute-0 sudo[59553]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:49 compute-0 sudo[59676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znfxkprszoyzprpupxpygdxgeovubkhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246848.8836682-618-276127241867185/AnsiballZ_copy.py'
Feb 16 13:00:49 compute-0 sudo[59676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:49 compute-0 python3.9[59678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246848.8836682-618-276127241867185/.source.returncode _original_basename=.ocs4xj22 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:49 compute-0 sudo[59676]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:50 compute-0 sudo[59828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anhcisvdonrnqlwojluaeocafeogbgcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246850.127053-650-256121575611754/AnsiballZ_stat.py'
Feb 16 13:00:50 compute-0 sudo[59828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:50 compute-0 python3.9[59830]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:50 compute-0 sudo[59828]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:50 compute-0 sudo[59951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yolsguybomlrfdgfzftogtytxeaggwqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246850.127053-650-256121575611754/AnsiballZ_copy.py'
Feb 16 13:00:50 compute-0 sudo[59951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:51 compute-0 python3.9[59953]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246850.127053-650-256121575611754/.source.cfg _original_basename=.p1xe4mzh follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:51 compute-0 sudo[59951]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:51 compute-0 sudo[60104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iklossdtylzfhcjvzdukjwdreytwmztc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246851.2785323-680-40200189323277/AnsiballZ_systemd.py'
Feb 16 13:00:51 compute-0 sudo[60104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:51 compute-0 python3.9[60106]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:00:51 compute-0 systemd[1]: Reloading Network Manager...
Feb 16 13:00:51 compute-0 NetworkManager[56177]: <info>  [1771246851.9300] audit: op="reload" arg="0" pid=60110 uid=0 result="success"
Feb 16 13:00:51 compute-0 NetworkManager[56177]: <info>  [1771246851.9307] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 16 13:00:51 compute-0 systemd[1]: Reloaded Network Manager.
Feb 16 13:00:51 compute-0 sudo[60104]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:52 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 13:00:52 compute-0 sshd-session[52157]: Connection closed by 192.168.122.30 port 56532
Feb 16 13:00:52 compute-0 sshd-session[52154]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:00:52 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 16 13:00:52 compute-0 systemd[1]: session-12.scope: Consumed 43.719s CPU time.
Feb 16 13:00:52 compute-0 systemd-logind[818]: Session 12 logged out. Waiting for processes to exit.
Feb 16 13:00:52 compute-0 systemd-logind[818]: Removed session 12.
Feb 16 13:00:58 compute-0 sshd-session[60143]: Accepted publickey for zuul from 192.168.122.30 port 39654 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:00:58 compute-0 systemd-logind[818]: New session 13 of user zuul.
Feb 16 13:00:58 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 16 13:00:58 compute-0 sshd-session[60143]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:00:59 compute-0 python3.9[60296]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:01 compute-0 CROND[60453]: (root) CMD (run-parts /etc/cron.hourly)
Feb 16 13:01:01 compute-0 run-parts[60456]: (/etc/cron.hourly) starting 0anacron
Feb 16 13:01:01 compute-0 anacron[60464]: Anacron started on 2026-02-16
Feb 16 13:01:01 compute-0 anacron[60464]: Will run job `cron.daily' in 11 min.
Feb 16 13:01:01 compute-0 anacron[60464]: Will run job `cron.weekly' in 31 min.
Feb 16 13:01:01 compute-0 anacron[60464]: Will run job `cron.monthly' in 51 min.
Feb 16 13:01:01 compute-0 anacron[60464]: Jobs will be executed sequentially
Feb 16 13:01:01 compute-0 python3.9[60450]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:01 compute-0 run-parts[60466]: (/etc/cron.hourly) finished 0anacron
Feb 16 13:01:01 compute-0 CROND[60452]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 16 13:01:01 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 13:01:02 compute-0 python3.9[60656]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:02 compute-0 sshd-session[60146]: Connection closed by 192.168.122.30 port 39654
Feb 16 13:01:02 compute-0 sshd-session[60143]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:01:02 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 16 13:01:02 compute-0 systemd[1]: session-13.scope: Consumed 1.935s CPU time.
Feb 16 13:01:02 compute-0 systemd-logind[818]: Session 13 logged out. Waiting for processes to exit.
Feb 16 13:01:02 compute-0 systemd-logind[818]: Removed session 13.
Feb 16 13:01:08 compute-0 sshd-session[60684]: Accepted publickey for zuul from 192.168.122.30 port 35960 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:01:08 compute-0 systemd-logind[818]: New session 14 of user zuul.
Feb 16 13:01:08 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 16 13:01:08 compute-0 sshd-session[60684]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:01:09 compute-0 python3.9[60837]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:10 compute-0 python3.9[60991]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:10 compute-0 sudo[61146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tshtyipnhjqazlizleritcraawkomwyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246870.5752037-60-264161532479833/AnsiballZ_setup.py'
Feb 16 13:01:10 compute-0 sudo[61146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:11 compute-0 python3.9[61148]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:11 compute-0 sudo[61146]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:11 compute-0 sudo[61230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvnlbnncbllokmdvwjrilftkfopzboeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246870.5752037-60-264161532479833/AnsiballZ_dnf.py'
Feb 16 13:01:11 compute-0 sudo[61230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:12 compute-0 python3.9[61232]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:13 compute-0 sudo[61230]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:13 compute-0 sudo[61383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmpfszvnkpmacxunnhgnkyqbbcylbls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246873.4141815-84-164161615029019/AnsiballZ_setup.py'
Feb 16 13:01:13 compute-0 sudo[61383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:13 compute-0 python3.9[61385]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:14 compute-0 sudo[61383]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:15 compute-0 sudo[61575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paxtcamjwvfnikckmkkuvvqtmaxyvtld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246874.459948-106-117920274948268/AnsiballZ_file.py'
Feb 16 13:01:15 compute-0 sudo[61575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:15 compute-0 python3.9[61577]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:15 compute-0 sudo[61575]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:16 compute-0 sudo[61727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbqxfxwxvhpilguebldrpenngsyukipt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246875.9807198-122-248365905876031/AnsiballZ_command.py'
Feb 16 13:01:16 compute-0 sudo[61727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:16 compute-0 python3.9[61729]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:01:16 compute-0 sudo[61727]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:17 compute-0 sudo[61892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmdyrciaxgpvqntczughnoqrzxeeptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246876.8869846-138-209332492205637/AnsiballZ_stat.py'
Feb 16 13:01:17 compute-0 sudo[61892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:17 compute-0 python3.9[61894]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:17 compute-0 sudo[61892]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:17 compute-0 sudo[61970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yisryzajfaewzkbtsaopdjlwtxeboovd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246876.8869846-138-209332492205637/AnsiballZ_file.py'
Feb 16 13:01:17 compute-0 sudo[61970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:17 compute-0 python3.9[61972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:17 compute-0 sudo[61970]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:18 compute-0 sudo[62122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvxlzxgyrkbuifnrxlhdbnejpuuydysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246878.1366336-162-22143989162497/AnsiballZ_stat.py'
Feb 16 13:01:18 compute-0 sudo[62122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:18 compute-0 python3.9[62124]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:18 compute-0 sudo[62122]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:18 compute-0 sudo[62200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucwtheekryafekezwundkzlnskpcozgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246878.1366336-162-22143989162497/AnsiballZ_file.py'
Feb 16 13:01:18 compute-0 sudo[62200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:19 compute-0 python3.9[62202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:19 compute-0 sudo[62200]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:20 compute-0 sudo[62352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxgyqcfoudqipdicgrqsmlpgiyesvjlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246879.7662363-188-183645899756575/AnsiballZ_ini_file.py'
Feb 16 13:01:20 compute-0 sudo[62352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:20 compute-0 python3.9[62354]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:20 compute-0 sudo[62352]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:20 compute-0 sudo[62504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbqihgtgbxazuldkghspwimyubjkwlid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246880.5636723-188-98471648507375/AnsiballZ_ini_file.py'
Feb 16 13:01:20 compute-0 sudo[62504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:21 compute-0 python3.9[62506]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:21 compute-0 sudo[62504]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:21 compute-0 sudo[62656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aypnauqnsbbfhxlkzjwczykrqzjaoufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246881.1516292-188-92963465761511/AnsiballZ_ini_file.py'
Feb 16 13:01:21 compute-0 sudo[62656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:21 compute-0 python3.9[62658]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:21 compute-0 sudo[62656]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:21 compute-0 sudo[62808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wurcgoahnwwhjwfqsknywjnzbfczrxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246881.7163765-188-53678658455384/AnsiballZ_ini_file.py'
Feb 16 13:01:21 compute-0 sudo[62808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:22 compute-0 python3.9[62810]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:22 compute-0 sudo[62808]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:22 compute-0 sudo[62960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txadgnuyymnrzjcrqtlajosndmonzrve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246882.3567207-250-100549916476698/AnsiballZ_dnf.py'
Feb 16 13:01:22 compute-0 sudo[62960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:22 compute-0 python3.9[62962]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:24 compute-0 sudo[62960]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:24 compute-0 sudo[63113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqtwtubjempfcfacaacougztxpohovrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246884.581589-272-271549836689148/AnsiballZ_setup.py'
Feb 16 13:01:24 compute-0 sudo[63113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:25 compute-0 python3.9[63115]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:25 compute-0 sudo[63113]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:25 compute-0 sudo[63267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ganzgwlmrmvemghcxvtezibfoqpnaxxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246885.316581-288-144170138963791/AnsiballZ_stat.py'
Feb 16 13:01:25 compute-0 sudo[63267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:25 compute-0 python3.9[63269]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:01:25 compute-0 sudo[63267]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:26 compute-0 sudo[63419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihwfzqjmexwnzutvgeaotxcqinpbqzyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246885.9997923-306-160711164046820/AnsiballZ_stat.py'
Feb 16 13:01:26 compute-0 sudo[63419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:26 compute-0 python3.9[63421]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:01:26 compute-0 sudo[63419]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:26 compute-0 sudo[63571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlqveerduunvrzkvlrpfnlcttxmdjasl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246886.7524552-326-187321621226694/AnsiballZ_command.py'
Feb 16 13:01:26 compute-0 sudo[63571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:27 compute-0 python3.9[63573]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:27 compute-0 sudo[63571]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:28 compute-0 sudo[63724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxlqhurezhmyvqnozftscijsgexlpqlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246887.4842834-346-264574700707906/AnsiballZ_service_facts.py'
Feb 16 13:01:28 compute-0 sudo[63724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:28 compute-0 python3.9[63726]: ansible-service_facts Invoked
Feb 16 13:01:28 compute-0 network[63743]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:01:28 compute-0 network[63744]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:01:28 compute-0 network[63745]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:01:30 compute-0 sudo[63724]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:32 compute-0 sudo[64029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflppsxxovxyvfoxgluwqovgjwakkkqx ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771246892.2641103-376-274976647914260/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771246892.2641103-376-274976647914260/args'
Feb 16 13:01:32 compute-0 sudo[64029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:32 compute-0 sudo[64029]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:33 compute-0 sudo[64196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiizickzsyvelrndfifvfpevtwpthsja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246892.9925203-398-153702268590393/AnsiballZ_dnf.py'
Feb 16 13:01:33 compute-0 sudo[64196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:33 compute-0 python3.9[64198]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:34 compute-0 sudo[64196]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:36 compute-0 sudo[64349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvxeiqkgxqwpdvsumnftkdbpdssvgaup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246895.5444384-424-107308146360846/AnsiballZ_package_facts.py'
Feb 16 13:01:36 compute-0 sudo[64349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:36 compute-0 python3.9[64351]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 16 13:01:36 compute-0 sudo[64349]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:37 compute-0 sudo[64501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsrjdwbxjiignbcpykvrrsyuhuhedtgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246897.3629885-444-46759536034046/AnsiballZ_stat.py'
Feb 16 13:01:37 compute-0 sudo[64501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:37 compute-0 python3.9[64503]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:37 compute-0 sudo[64501]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:38 compute-0 sudo[64626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkogbciyqdwyjurnfvnzrtslbdytddox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246897.3629885-444-46759536034046/AnsiballZ_copy.py'
Feb 16 13:01:38 compute-0 sudo[64626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:38 compute-0 python3.9[64628]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246897.3629885-444-46759536034046/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:38 compute-0 sudo[64626]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:39 compute-0 sudo[64780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-staomzgzziwqaqpxotozdhuvxksunxcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246898.8536575-474-86429293914838/AnsiballZ_stat.py'
Feb 16 13:01:39 compute-0 sudo[64780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:39 compute-0 python3.9[64782]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:39 compute-0 sudo[64780]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:39 compute-0 sudo[64905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fauiylgfwcjxdbqyychjcfpllpxyejns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246898.8536575-474-86429293914838/AnsiballZ_copy.py'
Feb 16 13:01:39 compute-0 sudo[64905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:39 compute-0 python3.9[64907]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246898.8536575-474-86429293914838/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:39 compute-0 sudo[64905]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:40 compute-0 sudo[65059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvswpemfvtkwrttqhfuyiedhzmeoiucv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246900.5583334-516-68764721803864/AnsiballZ_lineinfile.py'
Feb 16 13:01:40 compute-0 sudo[65059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:41 compute-0 python3.9[65061]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:41 compute-0 sudo[65059]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:42 compute-0 sudo[65213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azgetsptpvoxeipjbuzmejqtjkhfaqdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246901.9410515-546-46297403882434/AnsiballZ_setup.py'
Feb 16 13:01:42 compute-0 sudo[65213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:42 compute-0 python3.9[65215]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:42 compute-0 sudo[65213]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:43 compute-0 sudo[65297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leyfebnentnhvhabnqynlrcvwklmulnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246901.9410515-546-46297403882434/AnsiballZ_systemd.py'
Feb 16 13:01:43 compute-0 sudo[65297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:43 compute-0 python3.9[65299]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:01:43 compute-0 sudo[65297]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:44 compute-0 sudo[65451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeficjrabzxqhculxwjhwxggmcvxjpnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246904.2649298-578-52247109151240/AnsiballZ_setup.py'
Feb 16 13:01:44 compute-0 sudo[65451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:44 compute-0 python3.9[65453]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:45 compute-0 sudo[65451]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:45 compute-0 sudo[65535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sicbbjscztbaqabtetdlkirnshtsckvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246904.2649298-578-52247109151240/AnsiballZ_systemd.py'
Feb 16 13:01:45 compute-0 sudo[65535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:45 compute-0 python3.9[65537]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:01:45 compute-0 chronyd[842]: chronyd exiting
Feb 16 13:01:45 compute-0 systemd[1]: Stopping NTP client/server...
Feb 16 13:01:45 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 16 13:01:45 compute-0 systemd[1]: Stopped NTP client/server.
Feb 16 13:01:45 compute-0 systemd[1]: Starting NTP client/server...
Feb 16 13:01:45 compute-0 chronyd[65545]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 13:01:45 compute-0 chronyd[65545]: Frequency -24.870 +/- 0.099 ppm read from /var/lib/chrony/drift
Feb 16 13:01:45 compute-0 chronyd[65545]: Loaded seccomp filter (level 2)
Feb 16 13:01:45 compute-0 systemd[1]: Started NTP client/server.
Feb 16 13:01:45 compute-0 sudo[65535]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:46 compute-0 sshd-session[60687]: Connection closed by 192.168.122.30 port 35960
Feb 16 13:01:46 compute-0 sshd-session[60684]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:01:46 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 16 13:01:46 compute-0 systemd[1]: session-14.scope: Consumed 22.928s CPU time.
Feb 16 13:01:46 compute-0 systemd-logind[818]: Session 14 logged out. Waiting for processes to exit.
Feb 16 13:01:46 compute-0 systemd-logind[818]: Removed session 14.
Feb 16 13:01:51 compute-0 sshd-session[65571]: Accepted publickey for zuul from 192.168.122.30 port 59338 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:01:51 compute-0 systemd-logind[818]: New session 15 of user zuul.
Feb 16 13:01:51 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 16 13:01:51 compute-0 sshd-session[65571]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:01:52 compute-0 python3.9[65724]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:53 compute-0 sudo[65878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neqpjfzpzhfbwqsrfxdskdeqagcsqthn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246913.4804034-46-194649154745946/AnsiballZ_file.py'
Feb 16 13:01:53 compute-0 sudo[65878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:54 compute-0 python3.9[65880]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:54 compute-0 sudo[65878]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:54 compute-0 sudo[66053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbewesgngdhzriwfaigditzahewbjjzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246914.3257442-62-258943108923618/AnsiballZ_stat.py'
Feb 16 13:01:54 compute-0 sudo[66053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:55 compute-0 python3.9[66055]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:55 compute-0 sudo[66053]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:55 compute-0 sudo[66131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdczgrvpmkvnzyauektyxcualoossot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246914.3257442-62-258943108923618/AnsiballZ_file.py'
Feb 16 13:01:55 compute-0 sudo[66131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:55 compute-0 python3.9[66133]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ldojoq2p recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:55 compute-0 sudo[66131]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:56 compute-0 sudo[66283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvcytbhqkwbkdxsingdleqdomuauqqzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246915.8058794-102-280283625061379/AnsiballZ_stat.py'
Feb 16 13:01:56 compute-0 sudo[66283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:56 compute-0 python3.9[66285]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:56 compute-0 sudo[66283]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:56 compute-0 sudo[66406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqwjvrffncnnaizduuvrraeoidfmzgsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246915.8058794-102-280283625061379/AnsiballZ_copy.py'
Feb 16 13:01:56 compute-0 sudo[66406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:56 compute-0 python3.9[66408]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246915.8058794-102-280283625061379/.source _original_basename=.tqo3uh0_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:56 compute-0 sudo[66406]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:57 compute-0 sudo[66558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkccridvouseocoygiafctkgguemglka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.0564957-134-63231100046621/AnsiballZ_file.py'
Feb 16 13:01:57 compute-0 sudo[66558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:57 compute-0 python3.9[66560]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:57 compute-0 sudo[66558]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:57 compute-0 sudo[66710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzvkoshlttnsiranursotxukvnssvqyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.6725495-150-78888403726262/AnsiballZ_stat.py'
Feb 16 13:01:57 compute-0 sudo[66710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:58 compute-0 python3.9[66712]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:58 compute-0 sudo[66710]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:58 compute-0 sudo[66833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cricwhmlupjgdjyhllhevcftokjcrtqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.6725495-150-78888403726262/AnsiballZ_copy.py'
Feb 16 13:01:58 compute-0 sudo[66833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:58 compute-0 python3.9[66835]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246917.6725495-150-78888403726262/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:58 compute-0 sudo[66833]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:59 compute-0 sudo[66985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcoxxuvcroxlnqzzvxsxjbaysqouewfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246918.7892277-150-95309242204432/AnsiballZ_stat.py'
Feb 16 13:01:59 compute-0 sudo[66985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:59 compute-0 python3.9[66987]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:59 compute-0 sudo[66985]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:59 compute-0 sudo[67108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzusuadsxtdaylyqrpxmpqpygdssycfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246918.7892277-150-95309242204432/AnsiballZ_copy.py'
Feb 16 13:01:59 compute-0 sudo[67108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:59 compute-0 python3.9[67110]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246918.7892277-150-95309242204432/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:59 compute-0 sudo[67108]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:00 compute-0 sudo[67260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wztwczhtsnofduhnatedntwhvpfougog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246919.8451598-208-136178391280815/AnsiballZ_file.py'
Feb 16 13:02:00 compute-0 sudo[67260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:00 compute-0 python3.9[67262]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:00 compute-0 sudo[67260]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:00 compute-0 sudo[67412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwvuxnybehbtdodyshdlgwibxkkhnnkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246920.391639-224-16486652536291/AnsiballZ_stat.py'
Feb 16 13:02:00 compute-0 sudo[67412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:00 compute-0 python3.9[67414]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:00 compute-0 sudo[67412]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:01 compute-0 sudo[67535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eejyluhkjadzbjobvtzjbwzspngvaqqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246920.391639-224-16486652536291/AnsiballZ_copy.py'
Feb 16 13:02:01 compute-0 sudo[67535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:01 compute-0 python3.9[67537]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246920.391639-224-16486652536291/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:01 compute-0 sudo[67535]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:01 compute-0 sudo[67687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zifojbqmgptqyxvzebrbelgkllaeomqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246921.522397-254-18367505661219/AnsiballZ_stat.py'
Feb 16 13:02:01 compute-0 sudo[67687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:01 compute-0 python3.9[67689]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:01 compute-0 sudo[67687]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:02 compute-0 sudo[67810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoqscphzdnrixbqbfokhxeursxznrkpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246921.522397-254-18367505661219/AnsiballZ_copy.py'
Feb 16 13:02:02 compute-0 sudo[67810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:02 compute-0 python3.9[67812]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246921.522397-254-18367505661219/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:02 compute-0 sudo[67810]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:03 compute-0 sudo[67962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npajiipbzsmifaszbbkzwcihuitokasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246922.6436036-284-62810866151798/AnsiballZ_systemd.py'
Feb 16 13:02:03 compute-0 sudo[67962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:03 compute-0 python3.9[67964]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:03 compute-0 systemd[1]: Reloading.
Feb 16 13:02:03 compute-0 systemd-rc-local-generator[67992]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:03 compute-0 systemd-sysv-generator[67995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:03 compute-0 systemd[1]: Reloading.
Feb 16 13:02:03 compute-0 systemd-sysv-generator[68035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:03 compute-0 systemd-rc-local-generator[68031]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:03 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 16 13:02:03 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 16 13:02:03 compute-0 sudo[67962]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:04 compute-0 sudo[68204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkusjfbzvqvwfoeckeinvluidemwzsrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246924.0922303-300-91680625235308/AnsiballZ_stat.py'
Feb 16 13:02:04 compute-0 sudo[68204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:04 compute-0 python3.9[68206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:04 compute-0 sudo[68204]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:04 compute-0 sudo[68327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbnecohtilsqmjvepeepzavleznddybo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246924.0922303-300-91680625235308/AnsiballZ_copy.py'
Feb 16 13:02:04 compute-0 sudo[68327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:05 compute-0 python3.9[68329]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246924.0922303-300-91680625235308/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:05 compute-0 sudo[68327]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:05 compute-0 sudo[68479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bidgzbvkkxsnlumhbynjrbksmiobnanc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246925.1697583-330-98090008319542/AnsiballZ_stat.py'
Feb 16 13:02:05 compute-0 sudo[68479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:05 compute-0 python3.9[68481]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:05 compute-0 sudo[68479]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:05 compute-0 sudo[68602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtdtzuzkgcfggnpvdwopbxneilupqvde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246925.1697583-330-98090008319542/AnsiballZ_copy.py'
Feb 16 13:02:05 compute-0 sudo[68602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:06 compute-0 python3.9[68604]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246925.1697583-330-98090008319542/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:06 compute-0 sudo[68602]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:06 compute-0 sudo[68754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcubpticbrvighsifczjzyjbbhnqeufx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246926.2350657-360-100212466059814/AnsiballZ_systemd.py'
Feb 16 13:02:06 compute-0 sudo[68754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:06 compute-0 python3.9[68756]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:06 compute-0 systemd[1]: Reloading.
Feb 16 13:02:06 compute-0 systemd-sysv-generator[68787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:06 compute-0 systemd-rc-local-generator[68784]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:06 compute-0 systemd[1]: Reloading.
Feb 16 13:02:07 compute-0 systemd-rc-local-generator[68823]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:07 compute-0 systemd-sysv-generator[68828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:07 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 13:02:07 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:02:07 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:02:07 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 13:02:07 compute-0 sudo[68754]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:08 compute-0 python3.9[68997]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:02:08 compute-0 network[69014]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:02:08 compute-0 network[69015]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:02:08 compute-0 network[69016]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:02:11 compute-0 sudo[69277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdnydvwtprmchjryktxynzjdajqvmvnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246931.2407687-392-251777081485045/AnsiballZ_systemd.py'
Feb 16 13:02:11 compute-0 sudo[69277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:11 compute-0 python3.9[69279]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:11 compute-0 systemd[1]: Reloading.
Feb 16 13:02:12 compute-0 systemd-sysv-generator[69304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:12 compute-0 systemd-rc-local-generator[69300]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:12 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 16 13:02:12 compute-0 iptables.init[69326]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 16 13:02:12 compute-0 iptables.init[69326]: iptables: Flushing firewall rules: [  OK  ]
Feb 16 13:02:12 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 16 13:02:12 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 16 13:02:12 compute-0 sudo[69277]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:12 compute-0 sudo[69520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovxkuhugpzwhqcsdbymmwkxiuvfelise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246932.6946619-392-183284050023791/AnsiballZ_systemd.py'
Feb 16 13:02:12 compute-0 sudo[69520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:13 compute-0 python3.9[69522]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:13 compute-0 sudo[69520]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:13 compute-0 sudo[69674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdlredzizlkatuerntxzozlstjffujga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246933.5280774-424-100687258836054/AnsiballZ_systemd.py'
Feb 16 13:02:13 compute-0 sudo[69674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:14 compute-0 python3.9[69676]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:14 compute-0 systemd[1]: Reloading.
Feb 16 13:02:14 compute-0 systemd-rc-local-generator[69708]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:14 compute-0 systemd-sysv-generator[69713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:14 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 16 13:02:14 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 16 13:02:14 compute-0 sudo[69674]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:15 compute-0 sudo[69873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxlylnuhbdpgukarbxykxntyjwoxkpmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246934.7054796-440-144375112846861/AnsiballZ_command.py'
Feb 16 13:02:15 compute-0 sudo[69873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:15 compute-0 python3.9[69875]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:15 compute-0 sudo[69873]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:16 compute-0 sudo[70026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udagctakvhjrecjzneyaohlrgsawzhcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246935.8317149-468-69426574792277/AnsiballZ_stat.py'
Feb 16 13:02:16 compute-0 sudo[70026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:16 compute-0 python3.9[70028]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:16 compute-0 sudo[70026]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:16 compute-0 sudo[70151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nletsmvcsrbcnecxjhezipjemktanspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246935.8317149-468-69426574792277/AnsiballZ_copy.py'
Feb 16 13:02:16 compute-0 sudo[70151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:16 compute-0 python3.9[70153]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246935.8317149-468-69426574792277/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:16 compute-0 sudo[70151]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:17 compute-0 sudo[70305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbyafcyzdqaylwxjgxbizzjcvgqmazro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246937.5714414-498-34999770123222/AnsiballZ_systemd.py'
Feb 16 13:02:17 compute-0 sudo[70305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:18 compute-0 python3.9[70307]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:02:18 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 16 13:02:18 compute-0 sshd[1018]: Received SIGHUP; restarting.
Feb 16 13:02:18 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 16 13:02:18 compute-0 sshd[1018]: Server listening on 0.0.0.0 port 22.
Feb 16 13:02:18 compute-0 sshd[1018]: Server listening on :: port 22.
Feb 16 13:02:18 compute-0 sudo[70305]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:18 compute-0 sudo[70461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wujhavnaltjlvvjvngsfahwwvbyxawoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246938.449064-514-247514418690706/AnsiballZ_file.py'
Feb 16 13:02:18 compute-0 sudo[70461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:18 compute-0 python3.9[70463]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:18 compute-0 sudo[70461]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:19 compute-0 sudo[70613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzuvkwmfvjrpxzdahvhbcwdniajyfcbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246939.0376525-530-83548016268250/AnsiballZ_stat.py'
Feb 16 13:02:19 compute-0 sudo[70613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:19 compute-0 python3.9[70615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:19 compute-0 sudo[70613]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:19 compute-0 sudo[70736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fetwpdqsvqynhvcdohzghqspjlfwarvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246939.0376525-530-83548016268250/AnsiballZ_copy.py'
Feb 16 13:02:19 compute-0 sudo[70736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:19 compute-0 python3.9[70738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246939.0376525-530-83548016268250/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:20 compute-0 sudo[70736]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:20 compute-0 sudo[70888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsbtdnslmoppgauqmpghljfbhvuvktua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246940.4468594-566-96242617983446/AnsiballZ_timezone.py'
Feb 16 13:02:20 compute-0 sudo[70888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:21 compute-0 python3.9[70890]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 13:02:21 compute-0 systemd[1]: Starting Time & Date Service...
Feb 16 13:02:21 compute-0 systemd[1]: Started Time & Date Service.
Feb 16 13:02:21 compute-0 sudo[70888]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:21 compute-0 sudo[71044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdfmrnpaupjmkgphcwqsbrzxuladaywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246941.5564973-584-162914514761678/AnsiballZ_file.py'
Feb 16 13:02:21 compute-0 sudo[71044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:21 compute-0 python3.9[71046]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:21 compute-0 sudo[71044]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:23 compute-0 sudo[71196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpotipkotutogtsifkagkbkqccqrozrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246942.3341432-600-96232450891449/AnsiballZ_stat.py'
Feb 16 13:02:23 compute-0 sudo[71196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:23 compute-0 python3.9[71198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:23 compute-0 sudo[71196]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:23 compute-0 sudo[71319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erxkmrvkkafkdpntkzkxaozxnbapsvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246942.3341432-600-96232450891449/AnsiballZ_copy.py'
Feb 16 13:02:23 compute-0 sudo[71319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:23 compute-0 python3.9[71321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246942.3341432-600-96232450891449/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:23 compute-0 sudo[71319]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:24 compute-0 sudo[71471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttuwyfecttnfaowqlunrkzbkutgpckcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246944.0356817-630-81082767845470/AnsiballZ_stat.py'
Feb 16 13:02:24 compute-0 sudo[71471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:24 compute-0 python3.9[71473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:24 compute-0 sudo[71471]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:24 compute-0 sudo[71594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-macvhzhvrmaeucujkysipdblnrlzmrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246944.0356817-630-81082767845470/AnsiballZ_copy.py'
Feb 16 13:02:24 compute-0 sudo[71594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:24 compute-0 python3.9[71596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246944.0356817-630-81082767845470/.source.yaml _original_basename=.ang1d5el follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:24 compute-0 sudo[71594]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:25 compute-0 sudo[71746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arsxwtdzuxnunifaszjulayugbqzgkik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246945.1895719-660-175345097118732/AnsiballZ_stat.py'
Feb 16 13:02:25 compute-0 sudo[71746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:25 compute-0 python3.9[71748]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:25 compute-0 sudo[71746]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:25 compute-0 sudo[71869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxkhjcqxeuxwbqaeqyubescbdiqmnvms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246945.1895719-660-175345097118732/AnsiballZ_copy.py'
Feb 16 13:02:25 compute-0 sudo[71869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:26 compute-0 python3.9[71871]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246945.1895719-660-175345097118732/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:26 compute-0 sudo[71869]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:26 compute-0 sudo[72021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pswjtdelowyxkznfoluzkriqknknuzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246946.311081-690-156053347278064/AnsiballZ_command.py'
Feb 16 13:02:26 compute-0 sudo[72021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:26 compute-0 python3.9[72023]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:26 compute-0 sudo[72021]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:27 compute-0 sudo[72174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifibwhthcrxbpknojeajoxlisatkpoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246947.1674156-706-4796412871339/AnsiballZ_command.py'
Feb 16 13:02:27 compute-0 sudo[72174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:27 compute-0 python3.9[72176]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:27 compute-0 sudo[72174]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:28 compute-0 sudo[72327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-masklioclkdpnlrrghunbqikzegvwvqd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771246947.7956228-722-146646951038942/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:02:28 compute-0 sudo[72327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:28 compute-0 python3[72329]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:02:28 compute-0 sudo[72327]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:28 compute-0 sudo[72479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlbkezmrtkjvmcwcmrcuubwmycscdftg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246948.6532173-738-217452005386581/AnsiballZ_stat.py'
Feb 16 13:02:28 compute-0 sudo[72479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:29 compute-0 python3.9[72481]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:29 compute-0 sudo[72479]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:29 compute-0 sudo[72602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgsfenepdaelhoagafuemorljnalxjcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246948.6532173-738-217452005386581/AnsiballZ_copy.py'
Feb 16 13:02:29 compute-0 sudo[72602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:29 compute-0 python3.9[72604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246948.6532173-738-217452005386581/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:29 compute-0 sudo[72602]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:30 compute-0 sudo[72754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoihxqggbntghyifosracphkbspelazm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246949.8359683-768-279950460901843/AnsiballZ_stat.py'
Feb 16 13:02:30 compute-0 sudo[72754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:30 compute-0 python3.9[72756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:30 compute-0 sudo[72754]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:30 compute-0 sudo[72877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjwlzgbdhrypbqzhnnivstxcyyfjsrwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246949.8359683-768-279950460901843/AnsiballZ_copy.py'
Feb 16 13:02:30 compute-0 sudo[72877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:31 compute-0 python3.9[72879]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246949.8359683-768-279950460901843/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:31 compute-0 sudo[72877]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:31 compute-0 sudo[73029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azrmddpdimcwsjlvizpumgzuhiuhzhvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246951.2851498-798-265736318156561/AnsiballZ_stat.py'
Feb 16 13:02:31 compute-0 sudo[73029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:31 compute-0 python3.9[73031]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:31 compute-0 sudo[73029]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:32 compute-0 sudo[73152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eolbivxdicyipdclttxigwobtrqtkccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246951.2851498-798-265736318156561/AnsiballZ_copy.py'
Feb 16 13:02:32 compute-0 sudo[73152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:32 compute-0 python3.9[73154]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246951.2851498-798-265736318156561/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:32 compute-0 sudo[73152]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:32 compute-0 sudo[73304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cevdhburtausgnqbebwzlbhqkocqaklf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246952.4715724-828-204032217929187/AnsiballZ_stat.py'
Feb 16 13:02:32 compute-0 sudo[73304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:32 compute-0 python3.9[73306]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:32 compute-0 sudo[73304]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:33 compute-0 sudo[73427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddtnyueiezqxgneqpnlepnophcrrsvxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246952.4715724-828-204032217929187/AnsiballZ_copy.py'
Feb 16 13:02:33 compute-0 sudo[73427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:33 compute-0 python3.9[73429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246952.4715724-828-204032217929187/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:33 compute-0 sudo[73427]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:33 compute-0 sudo[73579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iadklcsadgralyjdxhjzbxzjvtkhvgiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246953.6732626-858-72634458066375/AnsiballZ_stat.py'
Feb 16 13:02:34 compute-0 sudo[73579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:34 compute-0 python3.9[73581]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:34 compute-0 sudo[73579]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:34 compute-0 sudo[73702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqlbabafkferrnodewzizoexqwykqouo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246953.6732626-858-72634458066375/AnsiballZ_copy.py'
Feb 16 13:02:34 compute-0 sudo[73702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:34 compute-0 python3.9[73704]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246953.6732626-858-72634458066375/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:34 compute-0 sudo[73702]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:35 compute-0 sudo[73854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfudfdbcsyymrinmevnkbgvxwhawkzmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246955.1513245-888-275998525575701/AnsiballZ_file.py'
Feb 16 13:02:35 compute-0 sudo[73854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:35 compute-0 python3.9[73856]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:35 compute-0 sudo[73854]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:36 compute-0 sudo[74006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkpztveuwqorzdnnbmhsgrgyrrnhyjva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246955.7653468-904-199181624910716/AnsiballZ_command.py'
Feb 16 13:02:36 compute-0 sudo[74006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:36 compute-0 python3.9[74008]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:36 compute-0 sudo[74006]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:36 compute-0 sudo[74165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojntpmdvktjgbcfdnpzrzbytbxdorjcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246956.482396-920-273710174037042/AnsiballZ_blockinfile.py'
Feb 16 13:02:36 compute-0 sudo[74165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:37 compute-0 python3.9[74167]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:37 compute-0 sudo[74165]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:37 compute-0 sudo[74318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nygrqulnfwbtcujopnvyapczefzclmad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246957.3337405-938-271289939946070/AnsiballZ_file.py'
Feb 16 13:02:37 compute-0 sudo[74318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:37 compute-0 python3.9[74320]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:37 compute-0 sudo[74318]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:38 compute-0 sudo[74470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ninrpwxnxwqoyxqjiqhfmslviossyrni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246957.9125142-938-167935071502470/AnsiballZ_file.py'
Feb 16 13:02:38 compute-0 sudo[74470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:38 compute-0 python3.9[74472]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:38 compute-0 sudo[74470]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:39 compute-0 sudo[74622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdgnribccikruyraahgzbkmvdnofxiyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246958.5368743-968-142759653714035/AnsiballZ_mount.py'
Feb 16 13:02:39 compute-0 sudo[74622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:39 compute-0 python3.9[74624]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 13:02:39 compute-0 sudo[74622]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:39 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:02:39 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:02:39 compute-0 sudo[74776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmrzvmjqrlllbjqhqnagwbbgrczeqsna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246959.5917592-968-7733706350540/AnsiballZ_mount.py'
Feb 16 13:02:39 compute-0 sudo[74776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:40 compute-0 python3.9[74778]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 13:02:40 compute-0 sudo[74776]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:40 compute-0 sshd-session[65574]: Connection closed by 192.168.122.30 port 59338
Feb 16 13:02:40 compute-0 sshd-session[65571]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:02:40 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 16 13:02:40 compute-0 systemd[1]: session-15.scope: Consumed 31.278s CPU time.
Feb 16 13:02:40 compute-0 systemd-logind[818]: Session 15 logged out. Waiting for processes to exit.
Feb 16 13:02:40 compute-0 systemd-logind[818]: Removed session 15.
Feb 16 13:02:46 compute-0 sshd-session[74804]: Accepted publickey for zuul from 192.168.122.30 port 55890 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:02:46 compute-0 systemd-logind[818]: New session 16 of user zuul.
Feb 16 13:02:46 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 16 13:02:46 compute-0 sshd-session[74804]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:02:47 compute-0 sudo[74957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubtpncwdovyhebglvordduoqckapypmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246967.0293872-22-40772374858117/AnsiballZ_tempfile.py'
Feb 16 13:02:47 compute-0 sudo[74957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:47 compute-0 python3.9[74959]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 16 13:02:47 compute-0 sudo[74957]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:48 compute-0 sudo[75109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpbeovwpebgsebnpzwukrsswipbcrzcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246967.8276289-46-249452956756267/AnsiballZ_stat.py'
Feb 16 13:02:48 compute-0 sudo[75109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:48 compute-0 python3.9[75111]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:02:48 compute-0 sudo[75109]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:49 compute-0 sudo[75261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltvqgkgpulpvbxpwtcfwimvhccbtohr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246968.6761954-66-97116839923955/AnsiballZ_setup.py'
Feb 16 13:02:49 compute-0 sudo[75261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:49 compute-0 python3.9[75263]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:02:49 compute-0 sudo[75261]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:50 compute-0 sudo[75413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmxvhpmivopgnlyjwsguvxrnhfswqvgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246969.8024366-83-4888003952360/AnsiballZ_blockinfile.py'
Feb 16 13:02:50 compute-0 sudo[75413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:50 compute-0 python3.9[75415]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCZXltIir5JBlvlPO7xLUmiSTW31gnyk58yBpRPAz3e1zsBdp3tp69owh1HkTmYc2BM0dUw6H1M71VuxI0SapqM1d5LkcPPguX1Mq7TGAQn2dsX6Piigs5Cgp5OXbpdp5/nJMF2TC4TLMXXab89NyRA6uh3T423AM/mWQEIBu3i252ANCm921kcESMFJPNdnV1B74UKLptZ8BUaExyksvXJtesoOoU5tovgAd4TFk6u4EgLNEnb8afOk11FDJSnTBwtrYzIJNIgo8EgU+JaDlS1BWU88QSSGYyUwznbA09nebRk89Vy+XJ9DYHlXuPU2Iz50yOO6dFTk0sfOA3/GDlBF2Z+I3eusAQR53HhJ08/uLwfEXOwpQfqAHgmIKXGBIcOSswNRXBRDVy46MVK0bxxRtljm2BHlSo/ayqvw63HW9V555GjPAmhtovOPJeX2/GaUTA8/48ZVHcvu7bnAjxyJK6lDyp1kVm2Zv0x8bl1mFbuOBi39ZeK+zc5AADi7CM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4vGkqkehtuFud87VeuVaCZH32Y8wUj3DYUltABnLf7
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOElrSCFS57nxTxtJ648wl1DVoVcAkQzVPwikLAgiomC/pYBiXtlGQhPs9E4LrY5DDQLvhyYJ2yYdVE/4SRyISo=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+TVRcBupgzmTdjTAl4PntIbYWysOWMtXN0mftJlk/ePbSMt7K7+RZk++0YQpyUwPpty3/HFX3wQRF9viEaPSK5jMBRYoRn1WdvOeWGMCfK7TDigQW8ojVMgO804XHVvxEG717wW4/uLTu8DPqc5HNEqPzVl1GTH02Xj7g2O/FoQmpQuTeoquar0XxVxfiemUgZKGMCLaArrVK5u5oEXiiXWIGno1zlGwZ78bOq/csrxTZqVtXhSr8cszXUWFTqDh9bafxdl/Lj8NyfjG/pw7mTjSD+9KfpGTW4PCTru5Yp7Jr0AGSNqcWo76aGPWIuF5Ev6byLNM9NPjyT//iGN7Ez8x7GshAoUtHZ4BytRuL71hTYzRVU4t/21c5bLoo+aaeQ6RYB8U2VTh3L8gL6mB7oL+u+ZyVhDLvrlb4OK7yG6PpFBqlvXqK92lIC7x+tqrLYh28gfkawLOE2pnD8CzcuZfIc/aZ41WDNkE+0xSUlVPieVEvR+Og9hYKqxEJjjs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHG+w/iJ+Sd31ZLka6ki6wkHvTMxAiuRNHy6U6ZlO/c7
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK6j3HO6Ca/k3kBzCq5boL0wpgaz3/9NCyn+y/MXv7x/dYitXgqAC8QrwaYe9xZNnaPdzPecAgq1NX9k2zO6yoI=
                                             create=True mode=0644 path=/tmp/ansible.8ozoz9ig state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:50 compute-0 sudo[75413]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:51 compute-0 sudo[75565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoebrvlalmqsnndiwefjthwsjdhvaqmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246970.597358-99-134646521641488/AnsiballZ_command.py'
Feb 16 13:02:51 compute-0 sudo[75565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:51 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 13:02:51 compute-0 python3.9[75567]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8ozoz9ig' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:51 compute-0 sudo[75565]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:51 compute-0 sudo[75722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amcexjxvfvpanrhrfvruxngerqbgccsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246971.5659356-115-73261897172768/AnsiballZ_file.py'
Feb 16 13:02:51 compute-0 sudo[75722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:52 compute-0 python3.9[75724]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8ozoz9ig state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:52 compute-0 sudo[75722]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:52 compute-0 sshd-session[74807]: Connection closed by 192.168.122.30 port 55890
Feb 16 13:02:52 compute-0 sshd-session[74804]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:02:52 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 16 13:02:52 compute-0 systemd[1]: session-16.scope: Consumed 3.072s CPU time.
Feb 16 13:02:52 compute-0 systemd-logind[818]: Session 16 logged out. Waiting for processes to exit.
Feb 16 13:02:52 compute-0 systemd-logind[818]: Removed session 16.
Feb 16 13:02:54 compute-0 sshd-session[75749]: Connection closed by authenticating user root 146.190.226.24 port 42234 [preauth]
Feb 16 13:02:58 compute-0 sshd-session[75751]: Accepted publickey for zuul from 192.168.122.30 port 60766 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:02:58 compute-0 systemd-logind[818]: New session 17 of user zuul.
Feb 16 13:02:58 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 16 13:02:58 compute-0 sshd-session[75751]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:02:59 compute-0 python3.9[75904]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:00 compute-0 sudo[76058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsdjjhikcmcrvulhlmcvdjhubwoxivkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246979.515734-44-110565500124808/AnsiballZ_systemd.py'
Feb 16 13:03:00 compute-0 sudo[76058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:00 compute-0 python3.9[76060]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 13:03:00 compute-0 sudo[76058]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:01 compute-0 sudo[76212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oggjjajybnsxrottcjldyxzllwkcqpbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246980.7022808-60-73093791222618/AnsiballZ_systemd.py'
Feb 16 13:03:01 compute-0 sudo[76212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:01 compute-0 python3.9[76214]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:03:01 compute-0 sudo[76212]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:01 compute-0 sudo[76365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvfwdfoybtxhayssbkzudvbbafrlmdyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246981.5517638-78-16582271573459/AnsiballZ_command.py'
Feb 16 13:03:01 compute-0 sudo[76365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:02 compute-0 python3.9[76367]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:02 compute-0 sudo[76365]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:02 compute-0 sudo[76518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqqsenfqitgiartrxqwgnxqoweputhis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246982.3103676-94-189446114876364/AnsiballZ_stat.py'
Feb 16 13:03:02 compute-0 sudo[76518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:02 compute-0 python3.9[76520]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:02 compute-0 sudo[76518]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:03 compute-0 sudo[76672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsbrgnsafzvdarwjmaikstzbtlavvjxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246983.072777-110-170343166078166/AnsiballZ_command.py'
Feb 16 13:03:03 compute-0 sudo[76672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:03 compute-0 python3.9[76674]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:03 compute-0 sudo[76672]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:04 compute-0 sudo[76827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awclcvtomvwhlllbihzxvgvnwzwryalc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246983.7440946-126-249051937853120/AnsiballZ_file.py'
Feb 16 13:03:04 compute-0 sudo[76827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:04 compute-0 python3.9[76829]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:04 compute-0 sudo[76827]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:05 compute-0 sshd-session[75754]: Connection closed by 192.168.122.30 port 60766
Feb 16 13:03:05 compute-0 sshd-session[75751]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:03:05 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 16 13:03:05 compute-0 systemd[1]: session-17.scope: Consumed 3.920s CPU time.
Feb 16 13:03:05 compute-0 systemd-logind[818]: Session 17 logged out. Waiting for processes to exit.
Feb 16 13:03:05 compute-0 systemd-logind[818]: Removed session 17.
Feb 16 13:03:10 compute-0 sshd-session[76855]: Accepted publickey for zuul from 192.168.122.30 port 48004 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:03:10 compute-0 systemd-logind[818]: New session 18 of user zuul.
Feb 16 13:03:10 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 16 13:03:10 compute-0 sshd-session[76855]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:03:11 compute-0 python3.9[77008]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:12 compute-0 sudo[77162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhhgoilhbmtnkbmzuubygkhwaukwroqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246992.4341097-48-197074524609835/AnsiballZ_setup.py'
Feb 16 13:03:12 compute-0 sudo[77162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:12 compute-0 python3.9[77164]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:03:13 compute-0 sudo[77162]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:13 compute-0 sudo[77246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdntlnbievccvubqdbsbvvoepbbveyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246992.4341097-48-197074524609835/AnsiballZ_dnf.py'
Feb 16 13:03:13 compute-0 sudo[77246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:14 compute-0 python3.9[77248]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 13:03:15 compute-0 sudo[77246]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:16 compute-0 python3.9[77399]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:17 compute-0 python3.9[77550]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:03:18 compute-0 python3.9[77700]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:19 compute-0 python3.9[77850]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:20 compute-0 sshd-session[76858]: Connection closed by 192.168.122.30 port 48004
Feb 16 13:03:20 compute-0 sshd-session[76855]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:03:20 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 16 13:03:20 compute-0 systemd[1]: session-18.scope: Consumed 5.368s CPU time.
Feb 16 13:03:20 compute-0 systemd-logind[818]: Session 18 logged out. Waiting for processes to exit.
Feb 16 13:03:20 compute-0 systemd-logind[818]: Removed session 18.
Feb 16 13:03:25 compute-0 sshd-session[77875]: Accepted publickey for zuul from 192.168.122.30 port 45898 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:03:25 compute-0 systemd-logind[818]: New session 19 of user zuul.
Feb 16 13:03:25 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 16 13:03:25 compute-0 sshd-session[77875]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:03:26 compute-0 python3.9[78028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:28 compute-0 sudo[78182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmsokdrxuipmsommscyfiouxsajdrfke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247008.0654013-81-50907049668996/AnsiballZ_file.py'
Feb 16 13:03:28 compute-0 sudo[78182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:28 compute-0 python3.9[78184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:28 compute-0 sudo[78182]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:29 compute-0 sudo[78334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmsbrapfoyiitlxxrpscxypcowjvxvsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247008.773734-81-76563558888198/AnsiballZ_file.py'
Feb 16 13:03:29 compute-0 sudo[78334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:29 compute-0 python3.9[78336]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:29 compute-0 sudo[78334]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:29 compute-0 sudo[78486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhjmgbhwqgkskapqlcvdutadscooyott ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247009.4155293-110-62756965592417/AnsiballZ_stat.py'
Feb 16 13:03:29 compute-0 sudo[78486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:29 compute-0 python3.9[78488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:30 compute-0 sudo[78486]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:30 compute-0 sudo[78609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpvpxwjtbmghoiqkwrupgdxwofysgeyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247009.4155293-110-62756965592417/AnsiballZ_copy.py'
Feb 16 13:03:30 compute-0 sudo[78609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:30 compute-0 python3.9[78611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247009.4155293-110-62756965592417/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=9822ea8f0cc23b8080274f6fc9fc0b42b5953311 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:30 compute-0 sudo[78609]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:31 compute-0 sudo[78761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrdghxnqbowxbeyemsuzxhycenpxqvuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247010.8593652-110-25475249350028/AnsiballZ_stat.py'
Feb 16 13:03:31 compute-0 sudo[78761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:31 compute-0 python3.9[78763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:31 compute-0 sudo[78761]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:31 compute-0 sudo[78884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csrznrgecdkzunsjzmkjmtectphmnrjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247010.8593652-110-25475249350028/AnsiballZ_copy.py'
Feb 16 13:03:31 compute-0 sudo[78884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:32 compute-0 python3.9[78886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247010.8593652-110-25475249350028/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c9dfcac50e41db328d94f1391b1aaddaecead554 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:32 compute-0 sudo[78884]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:32 compute-0 sudo[79036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqvgrphegcrneadndkyxvdxxqqpkbuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247012.2088146-110-133894185450249/AnsiballZ_stat.py'
Feb 16 13:03:32 compute-0 sudo[79036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:32 compute-0 python3.9[79038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:32 compute-0 sudo[79036]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:32 compute-0 sudo[79159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmdwfookxpkebuqngpmrwipvfaflcdtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247012.2088146-110-133894185450249/AnsiballZ_copy.py'
Feb 16 13:03:32 compute-0 sudo[79159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:33 compute-0 python3.9[79161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247012.2088146-110-133894185450249/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7da6129850ec93fd1da7a5a0e3d82c501b3154dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:33 compute-0 sudo[79159]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:33 compute-0 sudo[79311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yafzevzhcjjiylxtdezydtziwjvsxxvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247013.380009-205-20719528645735/AnsiballZ_file.py'
Feb 16 13:03:33 compute-0 sudo[79311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:33 compute-0 python3.9[79313]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:33 compute-0 sudo[79311]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:34 compute-0 sudo[79463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tptkeujcebwdarikvjusnoaohwoodeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247013.9592662-205-210742309150212/AnsiballZ_file.py'
Feb 16 13:03:34 compute-0 sudo[79463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:34 compute-0 python3.9[79465]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:34 compute-0 sudo[79463]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:34 compute-0 sudo[79615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cidigyrpdypbmwvaidpdghxootofwrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247014.5101502-235-66458990528493/AnsiballZ_stat.py'
Feb 16 13:03:34 compute-0 sudo[79615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:34 compute-0 python3.9[79617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:34 compute-0 sudo[79615]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:35 compute-0 sudo[79738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snhzsgmqbqxpbygdsggaqenpdmgqrjtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247014.5101502-235-66458990528493/AnsiballZ_copy.py'
Feb 16 13:03:35 compute-0 sudo[79738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:35 compute-0 python3.9[79740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247014.5101502-235-66458990528493/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=73ca733b40c48ec8628f95216678aa7824a24ee9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:35 compute-0 sudo[79738]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:36 compute-0 sudo[79890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujjyomerniczdhlxkjxyrcsdfkrmyahv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247015.911357-235-104447007169439/AnsiballZ_stat.py'
Feb 16 13:03:36 compute-0 sudo[79890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:36 compute-0 python3.9[79892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:36 compute-0 sudo[79890]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:36 compute-0 sudo[80013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwusansytfqevxsnrzwanxxnlbzsgofa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247015.911357-235-104447007169439/AnsiballZ_copy.py'
Feb 16 13:03:36 compute-0 sudo[80013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:36 compute-0 python3.9[80015]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247015.911357-235-104447007169439/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c95f61194c2ceee3c16fe7cffc94cfc98ee9379b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:36 compute-0 sudo[80013]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:37 compute-0 sudo[80165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciwkgjixylqgsokfbxrycvhlfantjizo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247017.1273062-235-19624926647149/AnsiballZ_stat.py'
Feb 16 13:03:37 compute-0 sudo[80165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:37 compute-0 python3.9[80167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:37 compute-0 sudo[80165]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:37 compute-0 sudo[80288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbhwxqzyioybpummtzlenkbslqclcoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247017.1273062-235-19624926647149/AnsiballZ_copy.py'
Feb 16 13:03:37 compute-0 sudo[80288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:38 compute-0 python3.9[80290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247017.1273062-235-19624926647149/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d42994bd45448c07d0beb0188d552dddf40b611b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:38 compute-0 sudo[80288]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:38 compute-0 sudo[80440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oupfchsozltrexmbgqadnokllitlsihw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247018.2745392-323-173147868964025/AnsiballZ_file.py'
Feb 16 13:03:38 compute-0 sudo[80440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:38 compute-0 python3.9[80442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:38 compute-0 sudo[80440]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:39 compute-0 sudo[80592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hffvayohwmaokxilnzymxwzrvyehicyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247018.8317897-323-219410159788192/AnsiballZ_file.py'
Feb 16 13:03:39 compute-0 sudo[80592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:39 compute-0 python3.9[80594]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:39 compute-0 sudo[80592]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:39 compute-0 sudo[80744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlgakcswtnvxpgoaabxobcdzmyntdqfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247019.5574183-353-225330798517638/AnsiballZ_stat.py'
Feb 16 13:03:39 compute-0 sudo[80744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:40 compute-0 python3.9[80746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:40 compute-0 sudo[80744]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:40 compute-0 sudo[80867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztgygctfzxighxcivfscosqqfyltxqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247019.5574183-353-225330798517638/AnsiballZ_copy.py'
Feb 16 13:03:40 compute-0 sudo[80867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:40 compute-0 python3.9[80869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247019.5574183-353-225330798517638/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=500b4fd3b03e8e203fe9f8ddbc70bcd8388cc784 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:40 compute-0 sudo[80867]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:40 compute-0 sudo[81019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chxdaahcvhhmviyjbemerkvtiwgvyytc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247020.7138078-353-13073725741250/AnsiballZ_stat.py'
Feb 16 13:03:40 compute-0 sudo[81019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:41 compute-0 python3.9[81021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:41 compute-0 sudo[81019]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:41 compute-0 sudo[81142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msbnilqmwafpszfoeooctzfwybxrgyek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247020.7138078-353-13073725741250/AnsiballZ_copy.py'
Feb 16 13:03:41 compute-0 sudo[81142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:41 compute-0 python3.9[81144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247020.7138078-353-13073725741250/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=35df7d65799c53f8c7036eda87c4670debdbe292 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:41 compute-0 sudo[81142]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:41 compute-0 sudo[81294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbphklhqvthyrgasofyxdjiejmxulykn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247021.7600768-353-21449860811657/AnsiballZ_stat.py'
Feb 16 13:03:41 compute-0 sudo[81294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:42 compute-0 python3.9[81296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:42 compute-0 sudo[81294]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:42 compute-0 sudo[81417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhrqgigkmalvnadosyxfsoqrezcbyndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247021.7600768-353-21449860811657/AnsiballZ_copy.py'
Feb 16 13:03:42 compute-0 sudo[81417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:42 compute-0 python3.9[81419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247021.7600768-353-21449860811657/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=da6b48652690be6e06ae21a878987fcd10574f53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:42 compute-0 sudo[81417]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:43 compute-0 sudo[81569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khgpjfbgnlwephalkgqcjzkjqbffktry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247022.9306386-442-46461919071270/AnsiballZ_file.py'
Feb 16 13:03:43 compute-0 sudo[81569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:43 compute-0 python3.9[81571]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:43 compute-0 sudo[81569]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:43 compute-0 sudo[81721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjxhyumfukfzfpetyzaoaceldqmlfgho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247023.5352309-442-217403583622546/AnsiballZ_file.py'
Feb 16 13:03:43 compute-0 sudo[81721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:43 compute-0 python3.9[81723]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:43 compute-0 sudo[81721]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:44 compute-0 sudo[81873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbuydabdfpfpsghbbhlnknlpfarrrbth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247024.1994033-476-38125251069473/AnsiballZ_stat.py'
Feb 16 13:03:44 compute-0 sudo[81873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:44 compute-0 python3.9[81875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:44 compute-0 sudo[81873]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:45 compute-0 sudo[81996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbagjfmnhjthyfweevvnenibabqlhdeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247024.1994033-476-38125251069473/AnsiballZ_copy.py'
Feb 16 13:03:45 compute-0 sudo[81996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:45 compute-0 python3.9[81998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247024.1994033-476-38125251069473/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=16cdc7673d54da4ace21b42cc73030ac13d3283a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:45 compute-0 sudo[81996]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:45 compute-0 sudo[82148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iukekqcckmwoaojqrwldeffpmzzrayld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247025.4988737-476-57412714803158/AnsiballZ_stat.py'
Feb 16 13:03:45 compute-0 sudo[82148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:45 compute-0 python3.9[82150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:45 compute-0 sudo[82148]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:46 compute-0 sudo[82271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqrlhjkmoldrzkzzvztixkwrvpcopnhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247025.4988737-476-57412714803158/AnsiballZ_copy.py'
Feb 16 13:03:46 compute-0 sudo[82271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:46 compute-0 python3.9[82273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247025.4988737-476-57412714803158/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=35df7d65799c53f8c7036eda87c4670debdbe292 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:46 compute-0 sudo[82271]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:46 compute-0 sudo[82423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twwyezxaxblbdlztxvcvwbbbnekyixga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247026.5357873-476-87797877216801/AnsiballZ_stat.py'
Feb 16 13:03:46 compute-0 sudo[82423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:46 compute-0 python3.9[82425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:46 compute-0 sudo[82423]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:47 compute-0 sudo[82546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpatoqyezkxiwmcunjezhngrjyzfbtjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247026.5357873-476-87797877216801/AnsiballZ_copy.py'
Feb 16 13:03:47 compute-0 sudo[82546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:47 compute-0 python3.9[82548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247026.5357873-476-87797877216801/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5fdead47e2cefc5eb912772b8e639057bfa79189 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:47 compute-0 sudo[82546]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:48 compute-0 sudo[82698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntxzcjridwkzyhphshxjfebrwkdhfbpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247028.3252034-604-37381666407008/AnsiballZ_file.py'
Feb 16 13:03:48 compute-0 sudo[82698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:48 compute-0 python3.9[82700]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:48 compute-0 sudo[82698]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:49 compute-0 sudo[82850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klztwbvzkptsbfvncpmaqkpqlhrkhrre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247028.9485679-622-136556923591318/AnsiballZ_stat.py'
Feb 16 13:03:49 compute-0 sudo[82850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:49 compute-0 python3.9[82852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:49 compute-0 sudo[82850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:50 compute-0 sudo[82973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvgksyxsebkgtqjsezmxabuptywvuwdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247028.9485679-622-136556923591318/AnsiballZ_copy.py'
Feb 16 13:03:50 compute-0 sudo[82973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:50 compute-0 python3.9[82975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247028.9485679-622-136556923591318/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:50 compute-0 sudo[82973]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:50 compute-0 sudo[83125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ushrimdowhovcrmlmwqxsrspulajttys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247030.475544-656-118335790889919/AnsiballZ_file.py'
Feb 16 13:03:50 compute-0 sudo[83125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:50 compute-0 python3.9[83127]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:50 compute-0 sudo[83125]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:51 compute-0 sudo[83277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsovrjqdvnrdehgrrkzweacurgjonyhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247031.1148465-675-91440152178256/AnsiballZ_stat.py'
Feb 16 13:03:51 compute-0 sudo[83277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:51 compute-0 python3.9[83279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:51 compute-0 sudo[83277]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:51 compute-0 sudo[83400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylgwjuiljkpmqrmelhifcztpbtpywpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247031.1148465-675-91440152178256/AnsiballZ_copy.py'
Feb 16 13:03:51 compute-0 sudo[83400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:51 compute-0 python3.9[83402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247031.1148465-675-91440152178256/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:51 compute-0 sudo[83400]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:52 compute-0 sudo[83552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyyybaxtgphjxxhukfadwrawjwcseoiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.189712-706-93545818253891/AnsiballZ_file.py'
Feb 16 13:03:52 compute-0 sudo[83552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:52 compute-0 python3.9[83554]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:52 compute-0 sudo[83552]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:53 compute-0 sudo[83704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atygfsfvclrpclxbdbvwwquoajqlfczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.8013468-722-202702494503456/AnsiballZ_stat.py'
Feb 16 13:03:53 compute-0 sudo[83704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:53 compute-0 python3.9[83706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:53 compute-0 sudo[83704]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:53 compute-0 sudo[83827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqcpojtwwtwypfeucpjysrvfpqegtgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.8013468-722-202702494503456/AnsiballZ_copy.py'
Feb 16 13:03:53 compute-0 sudo[83827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:53 compute-0 python3.9[83829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247032.8013468-722-202702494503456/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:53 compute-0 sudo[83827]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:54 compute-0 sudo[83979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfvkercvnmuadeuvqifgkuzlkqflfhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.113928-753-160285357577346/AnsiballZ_file.py'
Feb 16 13:03:54 compute-0 sudo[83979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:54 compute-0 python3.9[83981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:54 compute-0 sudo[83979]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:54 compute-0 sudo[84131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imguzrvwuaaddcdddntybuleqbzzlpmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.7261622-770-252792926212328/AnsiballZ_stat.py'
Feb 16 13:03:55 compute-0 sudo[84131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:55 compute-0 chronyd[65545]: Selected source 4.206.183.154 (pool.ntp.org)
Feb 16 13:03:55 compute-0 python3.9[84133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:55 compute-0 sudo[84131]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:55 compute-0 sudo[84254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhpusxziqgfvkfjjwodhkcjvqhfvzllb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.7261622-770-252792926212328/AnsiballZ_copy.py'
Feb 16 13:03:55 compute-0 sudo[84254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:55 compute-0 python3.9[84256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247034.7261622-770-252792926212328/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:55 compute-0 sudo[84254]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:56 compute-0 sudo[84406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxhafmjsacihsnrwupcsqqunphrssbxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247035.9750607-805-93117420920167/AnsiballZ_file.py'
Feb 16 13:03:56 compute-0 sudo[84406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:56 compute-0 python3.9[84408]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:56 compute-0 sudo[84406]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:56 compute-0 sudo[84558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxmsdmokxghagzdzqtmlolfymaoyxdfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247036.619678-822-111538416795012/AnsiballZ_stat.py'
Feb 16 13:03:56 compute-0 sudo[84558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:57 compute-0 python3.9[84560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:57 compute-0 sudo[84558]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:57 compute-0 sudo[84681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crjnpkhdnlyaosdviudrmcabylqiqgwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247036.619678-822-111538416795012/AnsiballZ_copy.py'
Feb 16 13:03:57 compute-0 sudo[84681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:57 compute-0 python3.9[84683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247036.619678-822-111538416795012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:57 compute-0 sudo[84681]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:58 compute-0 sudo[84833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfahfhqqbeinvpqcioawxoaqcwqkhavv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247038.1073709-855-148224405649903/AnsiballZ_file.py'
Feb 16 13:03:58 compute-0 sudo[84833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:58 compute-0 python3.9[84835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:58 compute-0 sudo[84833]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:59 compute-0 sudo[84985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmhmumdayumluvmnkhqoizpnextcgls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247038.7826006-874-57956954366010/AnsiballZ_stat.py'
Feb 16 13:03:59 compute-0 sudo[84985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:59 compute-0 python3.9[84987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:59 compute-0 sudo[84985]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:59 compute-0 sudo[85110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdofrjzeraealccavcabesewnrswjkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247038.7826006-874-57956954366010/AnsiballZ_copy.py'
Feb 16 13:03:59 compute-0 sudo[85110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:59 compute-0 sshd-session[84988]: Connection closed by authenticating user root 146.190.226.24 port 35346 [preauth]
Feb 16 13:03:59 compute-0 python3.9[85112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247038.7826006-874-57956954366010/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:59 compute-0 sudo[85110]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:00 compute-0 sudo[85262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ermyxgjmxnlajjmsrmadzxrnwaefptdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247039.9707546-895-274425126710971/AnsiballZ_file.py'
Feb 16 13:04:00 compute-0 sudo[85262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:00 compute-0 python3.9[85264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:00 compute-0 sudo[85262]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:00 compute-0 sudo[85414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udpxttcwzcyucgsuzsqsxazzdizegvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247040.5996091-903-163944038949900/AnsiballZ_stat.py'
Feb 16 13:04:00 compute-0 sudo[85414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:01 compute-0 python3.9[85416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:01 compute-0 sudo[85414]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:01 compute-0 sudo[85537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jozhlsocnvuqjwairgdfwvvohezlazvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247040.5996091-903-163944038949900/AnsiballZ_copy.py'
Feb 16 13:04:01 compute-0 sudo[85537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:01 compute-0 python3.9[85539]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247040.5996091-903-163944038949900/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:01 compute-0 sudo[85537]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:02 compute-0 sshd-session[77878]: Connection closed by 192.168.122.30 port 45898
Feb 16 13:04:02 compute-0 sshd-session[77875]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:04:02 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 16 13:04:02 compute-0 systemd[1]: session-19.scope: Consumed 25.196s CPU time.
Feb 16 13:04:02 compute-0 systemd-logind[818]: Session 19 logged out. Waiting for processes to exit.
Feb 16 13:04:02 compute-0 systemd-logind[818]: Removed session 19.
Feb 16 13:04:07 compute-0 sshd-session[85564]: Accepted publickey for zuul from 192.168.122.30 port 41374 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:04:07 compute-0 systemd-logind[818]: New session 20 of user zuul.
Feb 16 13:04:07 compute-0 systemd[1]: Started Session 20 of User zuul.
Feb 16 13:04:07 compute-0 sshd-session[85564]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:04:08 compute-0 python3.9[85717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:09 compute-0 sudo[85871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcphooicewhqjxzsltjndoiwyxrbrqts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247049.402909-48-114521635606635/AnsiballZ_file.py'
Feb 16 13:04:09 compute-0 sudo[85871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:09 compute-0 python3.9[85873]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:10 compute-0 sudo[85871]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:10 compute-0 sudo[86023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nozlwkfhckcdmitrfrscjsekasvdsvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247050.1572626-48-73921816390696/AnsiballZ_file.py'
Feb 16 13:04:10 compute-0 sudo[86023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:10 compute-0 python3.9[86025]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:10 compute-0 sudo[86023]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:11 compute-0 python3.9[86175]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:11 compute-0 sudo[86325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdaqojhywadhquqpshtwrorjviqauaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247051.423363-94-90882185065759/AnsiballZ_seboolean.py'
Feb 16 13:04:11 compute-0 sudo[86325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:12 compute-0 python3.9[86327]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 13:04:12 compute-0 sudo[86325]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:13 compute-0 sudo[86481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stpofxmuewxjsrcrkkwvuwdbnooevhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247053.3040357-114-29551738108744/AnsiballZ_setup.py'
Feb 16 13:04:13 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 16 13:04:13 compute-0 sudo[86481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:13 compute-0 python3.9[86483]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:04:14 compute-0 sudo[86481]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:14 compute-0 sudo[86565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uidihspmwydqsczhiiajbcdmbdoniilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247053.3040357-114-29551738108744/AnsiballZ_dnf.py'
Feb 16 13:04:14 compute-0 sudo[86565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:14 compute-0 python3.9[86567]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:04:15 compute-0 sudo[86565]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:16 compute-0 sudo[86718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbrywetbyjgnccnifquvozpvdmiampii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247056.194403-138-251767864844079/AnsiballZ_systemd.py'
Feb 16 13:04:16 compute-0 sudo[86718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:16 compute-0 python3.9[86720]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:04:17 compute-0 sudo[86718]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:17 compute-0 sudo[86873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfgthdgdzjkdqaylpsgzoyipcdaocxci ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247057.2096202-154-78606148358201/AnsiballZ_edpm_nftables_snippet.py'
Feb 16 13:04:17 compute-0 sudo[86873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:17 compute-0 python3[86875]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 16 13:04:17 compute-0 sudo[86873]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:18 compute-0 sudo[87025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfruxlrmibwuybirmmhbgotvsssuuljo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.0974386-172-108207631027624/AnsiballZ_file.py'
Feb 16 13:04:18 compute-0 sudo[87025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:18 compute-0 python3.9[87027]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:18 compute-0 sudo[87025]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:19 compute-0 sudo[87177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzzlgbepyzrwwixwusxlhanqsdvugumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.7360723-188-221318048127158/AnsiballZ_stat.py'
Feb 16 13:04:19 compute-0 sudo[87177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:19 compute-0 python3.9[87179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:19 compute-0 sudo[87177]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:19 compute-0 sudo[87255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjwhqjhuxqhwsxmrfltkrkqpwlvlzdmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.7360723-188-221318048127158/AnsiballZ_file.py'
Feb 16 13:04:19 compute-0 sudo[87255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:19 compute-0 python3.9[87257]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:19 compute-0 sudo[87255]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:20 compute-0 sudo[87407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fukbdulcgcahvlhptirinvkqbqctphql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247060.2932963-212-251026528901252/AnsiballZ_stat.py'
Feb 16 13:04:20 compute-0 sudo[87407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:20 compute-0 python3.9[87409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:20 compute-0 sudo[87407]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:20 compute-0 sudo[87485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfgfnmttpxmsqsivosfpvvmivaarmjjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247060.2932963-212-251026528901252/AnsiballZ_file.py'
Feb 16 13:04:20 compute-0 sudo[87485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:21 compute-0 python3.9[87487]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.okc420m1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:21 compute-0 sudo[87485]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:21 compute-0 sudo[87637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvglegxbaqnxkfydttchwkfpvfjdmahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247061.320173-236-110190695056557/AnsiballZ_stat.py'
Feb 16 13:04:21 compute-0 sudo[87637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:21 compute-0 python3.9[87639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:21 compute-0 sudo[87637]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:21 compute-0 sudo[87715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzxnpleksceuskpwibedceirevxdawse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247061.320173-236-110190695056557/AnsiballZ_file.py'
Feb 16 13:04:21 compute-0 sudo[87715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:22 compute-0 python3.9[87717]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:22 compute-0 sudo[87715]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:22 compute-0 sudo[87867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxpxlbqohmhgnoiflaismcxrwdoypryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247062.4346797-262-117729488228790/AnsiballZ_command.py'
Feb 16 13:04:22 compute-0 sudo[87867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:23 compute-0 python3.9[87869]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:23 compute-0 sudo[87867]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:23 compute-0 sudo[88020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icxjwdoieyipkpaxqckmjbcubfruvxyb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247063.2556028-278-216034887556797/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:04:23 compute-0 sudo[88020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:23 compute-0 python3[88022]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:04:23 compute-0 sudo[88020]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:24 compute-0 sudo[88172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgcwzkzprttrelthzypalrzadqsomwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247064.0559754-294-101302245700963/AnsiballZ_stat.py'
Feb 16 13:04:24 compute-0 sudo[88172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:24 compute-0 python3.9[88174]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:24 compute-0 sudo[88172]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:24 compute-0 sudo[88297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqwprtowgkcjujvzpwwpzczwitdnrxiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247064.0559754-294-101302245700963/AnsiballZ_copy.py'
Feb 16 13:04:24 compute-0 sudo[88297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:25 compute-0 python3.9[88299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247064.0559754-294-101302245700963/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:25 compute-0 sudo[88297]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:25 compute-0 sudo[88449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvlvtqgkqftobudwajgzwzqmgnsingwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247065.3100595-324-79433065154420/AnsiballZ_stat.py'
Feb 16 13:04:25 compute-0 sudo[88449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:25 compute-0 python3.9[88451]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:25 compute-0 sudo[88449]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:26 compute-0 sudo[88574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvxwescxclptadolytfrqqsbxworjggm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247065.3100595-324-79433065154420/AnsiballZ_copy.py'
Feb 16 13:04:26 compute-0 sudo[88574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:26 compute-0 python3.9[88576]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247065.3100595-324-79433065154420/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:26 compute-0 sudo[88574]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:26 compute-0 sudo[88726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqwhjcawsmjmzimlepoycrnydyptevxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247066.4676855-354-250509752353091/AnsiballZ_stat.py'
Feb 16 13:04:26 compute-0 sudo[88726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:26 compute-0 python3.9[88728]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:26 compute-0 sudo[88726]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:27 compute-0 sudo[88851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waxfbllriuejvtwuajwhernvypkzmhqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247066.4676855-354-250509752353091/AnsiballZ_copy.py'
Feb 16 13:04:27 compute-0 sudo[88851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:27 compute-0 python3.9[88853]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247066.4676855-354-250509752353091/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:27 compute-0 sudo[88851]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:27 compute-0 sudo[89003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgytfixwrjkugzqxpzrmxyjgjstqacpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247067.6318061-384-153883793626468/AnsiballZ_stat.py'
Feb 16 13:04:27 compute-0 sudo[89003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:28 compute-0 python3.9[89005]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:28 compute-0 sudo[89003]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:28 compute-0 sudo[89128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htjpftikdsnenvmhaexnnebibvqntiem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247067.6318061-384-153883793626468/AnsiballZ_copy.py'
Feb 16 13:04:28 compute-0 sudo[89128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:28 compute-0 python3.9[89130]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247067.6318061-384-153883793626468/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:28 compute-0 sudo[89128]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:29 compute-0 sudo[89280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxyamygnbcmksvaogqjpcbuydqsqtrbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247068.7970157-414-47699400397660/AnsiballZ_stat.py'
Feb 16 13:04:29 compute-0 sudo[89280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:29 compute-0 python3.9[89282]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:29 compute-0 sudo[89280]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:29 compute-0 sudo[89405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vovvkxetgyhtcrwuxxlpeiqwqsmuvxqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247068.7970157-414-47699400397660/AnsiballZ_copy.py'
Feb 16 13:04:29 compute-0 sudo[89405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:29 compute-0 python3.9[89407]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247068.7970157-414-47699400397660/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:29 compute-0 sudo[89405]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:30 compute-0 sudo[89557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uswumbvxmrhfehinrvxwmwuqesnjxfjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247070.129387-444-278645236382588/AnsiballZ_file.py'
Feb 16 13:04:30 compute-0 sudo[89557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:30 compute-0 python3.9[89559]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:30 compute-0 sudo[89557]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:30 compute-0 sudo[89709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxtjdlshzcdqarmoowerlgdmtwhmksd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247070.7486944-460-46120724798553/AnsiballZ_command.py'
Feb 16 13:04:30 compute-0 sudo[89709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:31 compute-0 python3.9[89711]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:31 compute-0 sudo[89709]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:31 compute-0 sudo[89864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oevpkahvhsoqopptvgdeokmfohoyznfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247071.367953-476-228349319599505/AnsiballZ_blockinfile.py'
Feb 16 13:04:31 compute-0 sudo[89864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:31 compute-0 python3.9[89866]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:31 compute-0 sudo[89864]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:32 compute-0 sudo[90016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertfvblyjlluftypxrkcgnhzyofqzozv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247072.2071016-494-44106478761189/AnsiballZ_command.py'
Feb 16 13:04:32 compute-0 sudo[90016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:32 compute-0 python3.9[90018]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:32 compute-0 sudo[90016]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:33 compute-0 sudo[90169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjkgulqyjrvklznfbaubwtakbankzpme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247072.8917875-510-250156571024671/AnsiballZ_stat.py'
Feb 16 13:04:33 compute-0 sudo[90169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:33 compute-0 python3.9[90171]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:04:33 compute-0 sudo[90169]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:33 compute-0 sudo[90323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kconjlbuptxohomgayzomzmsbwauehnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247073.5666804-526-174881155991665/AnsiballZ_command.py'
Feb 16 13:04:33 compute-0 sudo[90323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:33 compute-0 python3.9[90325]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:33 compute-0 sudo[90323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:34 compute-0 sudo[90478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirgwarfruswupowngcnkghybalknxny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247074.213869-542-67053250919807/AnsiballZ_file.py'
Feb 16 13:04:34 compute-0 sudo[90478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:34 compute-0 python3.9[90480]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:34 compute-0 sudo[90478]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:35 compute-0 python3.9[90630]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:36 compute-0 sudo[90781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeiqaqlroklffblpwhmtbeydswdwkhkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247076.4473305-622-198633145294718/AnsiballZ_command.py'
Feb 16 13:04:36 compute-0 sudo[90781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:36 compute-0 python3.9[90783]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:9f:1d:bd:e8" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:36 compute-0 ovs-vsctl[90784]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:9f:1d:bd:e8 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 16 13:04:36 compute-0 sudo[90781]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:37 compute-0 sudo[90934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydpkssdnmmtfctnwktpmkgilonzdrhwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247077.1100733-640-179945389696023/AnsiballZ_command.py'
Feb 16 13:04:37 compute-0 sudo[90934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:37 compute-0 python3.9[90936]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:37 compute-0 sudo[90934]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:38 compute-0 sudo[91089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bylrhcubzzbthxptarspgrrxcgqdisjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247077.7587538-656-140777479882045/AnsiballZ_command.py'
Feb 16 13:04:38 compute-0 sudo[91089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:38 compute-0 python3.9[91091]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:38 compute-0 ovs-vsctl[91092]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 16 13:04:38 compute-0 sudo[91089]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:38 compute-0 python3.9[91242]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:04:39 compute-0 sudo[91394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqperyfluagmczdqanyhjlusscpojmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.312364-690-225717038567965/AnsiballZ_file.py'
Feb 16 13:04:39 compute-0 sudo[91394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:39 compute-0 python3.9[91396]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:39 compute-0 sudo[91394]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:40 compute-0 sudo[91546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwptgpefwsjxibrwcentcrsnllaxvjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.9916952-706-70148574492947/AnsiballZ_stat.py'
Feb 16 13:04:40 compute-0 sudo[91546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:40 compute-0 python3.9[91548]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:40 compute-0 sudo[91546]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:40 compute-0 sudo[91624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijdefmuyftwznxcqtnuyrkueotwmqwde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.9916952-706-70148574492947/AnsiballZ_file.py'
Feb 16 13:04:40 compute-0 sudo[91624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:40 compute-0 python3.9[91626]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:40 compute-0 sudo[91624]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:41 compute-0 sudo[91776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-donmwukdpreipzyuchghihrztklalohg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247081.042926-706-9350282347293/AnsiballZ_stat.py'
Feb 16 13:04:41 compute-0 sudo[91776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:41 compute-0 python3.9[91778]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:41 compute-0 sudo[91776]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:41 compute-0 sudo[91854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzsvaruzuwgogimaegppjiampzmuwffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247081.042926-706-9350282347293/AnsiballZ_file.py'
Feb 16 13:04:41 compute-0 sudo[91854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:41 compute-0 python3.9[91856]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:41 compute-0 sudo[91854]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:42 compute-0 sudo[92006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guxfhszdvgyjkhidjkvwicagjkzgsllr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.1565135-752-119634097958999/AnsiballZ_file.py'
Feb 16 13:04:42 compute-0 sudo[92006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:42 compute-0 python3.9[92008]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:42 compute-0 sudo[92006]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:43 compute-0 sudo[92158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lthtetwwpjggkcreuawputmsydvwnfyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.9718146-768-51559247478551/AnsiballZ_stat.py'
Feb 16 13:04:43 compute-0 sudo[92158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:43 compute-0 python3.9[92160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:43 compute-0 sudo[92158]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:43 compute-0 sudo[92236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qetssowztmfqfmgypdqirrmmqwjjhuza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.9718146-768-51559247478551/AnsiballZ_file.py'
Feb 16 13:04:43 compute-0 sudo[92236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:43 compute-0 python3.9[92238]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:43 compute-0 sudo[92236]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:44 compute-0 sudo[92388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnbgnnqsakjbpddzcaxbbxgyupbbehws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247084.078696-792-48233427992804/AnsiballZ_stat.py'
Feb 16 13:04:44 compute-0 sudo[92388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:44 compute-0 python3.9[92390]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:44 compute-0 sudo[92388]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:44 compute-0 sudo[92466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgetoxyylzojngndwbqjvdcpipsddmxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247084.078696-792-48233427992804/AnsiballZ_file.py'
Feb 16 13:04:44 compute-0 sudo[92466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:44 compute-0 python3.9[92468]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:44 compute-0 sudo[92466]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:45 compute-0 sudo[92618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shmpqklxuthrknrlkxuzxwbrkqvrxhlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247085.1686232-816-90756629220233/AnsiballZ_systemd.py'
Feb 16 13:04:45 compute-0 sudo[92618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:45 compute-0 python3.9[92620]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:04:45 compute-0 systemd[1]: Reloading.
Feb 16 13:04:45 compute-0 systemd-sysv-generator[92652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:04:45 compute-0 systemd-rc-local-generator[92647]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:04:46 compute-0 sudo[92618]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:46 compute-0 sudo[92814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vceuldfvlkzcvzdiwzznyswcgdcagobh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247086.4676702-832-196693231858329/AnsiballZ_stat.py'
Feb 16 13:04:46 compute-0 sudo[92814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:46 compute-0 python3.9[92816]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:46 compute-0 sudo[92814]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:47 compute-0 sudo[92892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waatmnuktyyandxlwjtgosnbtfxtkxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247086.4676702-832-196693231858329/AnsiballZ_file.py'
Feb 16 13:04:47 compute-0 sudo[92892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:47 compute-0 python3.9[92894]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:47 compute-0 sudo[92892]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:47 compute-0 sudo[93044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjkugtqdhthcmchdanyjahlcuetxrsrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247087.5027926-856-84162772560922/AnsiballZ_stat.py'
Feb 16 13:04:47 compute-0 sudo[93044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:47 compute-0 python3.9[93046]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:47 compute-0 sudo[93044]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:48 compute-0 sudo[93122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdizhyyqptuemyodcqemtetuuhrewgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247087.5027926-856-84162772560922/AnsiballZ_file.py'
Feb 16 13:04:48 compute-0 sudo[93122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:48 compute-0 python3.9[93124]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:48 compute-0 sudo[93122]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:48 compute-0 sudo[93274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agxccqyuullonvducuikigvccnywctyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247088.5621352-880-20079030744073/AnsiballZ_systemd.py'
Feb 16 13:04:48 compute-0 sudo[93274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:49 compute-0 python3.9[93276]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:04:49 compute-0 systemd[1]: Reloading.
Feb 16 13:04:49 compute-0 systemd-rc-local-generator[93299]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:04:49 compute-0 systemd-sysv-generator[93302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:04:49 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 13:04:49 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:04:49 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:04:49 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 13:04:49 compute-0 sudo[93274]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:49 compute-0 sudo[93473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwdqkoltbrmzykjgdqduhllphnxpxtoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247089.6392713-900-69394877203225/AnsiballZ_file.py'
Feb 16 13:04:49 compute-0 sudo[93473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:50 compute-0 python3.9[93475]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:50 compute-0 sudo[93473]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:50 compute-0 sudo[93625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqbfkbshgrvwxpjwslezplylsowxlnlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247090.2803125-916-278701985187700/AnsiballZ_stat.py'
Feb 16 13:04:50 compute-0 sudo[93625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:51 compute-0 python3.9[93627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:51 compute-0 sudo[93625]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:51 compute-0 sudo[93748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvivipqsugkfjguidklpyeyaaupacloq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247090.2803125-916-278701985187700/AnsiballZ_copy.py'
Feb 16 13:04:51 compute-0 sudo[93748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:51 compute-0 python3.9[93750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247090.2803125-916-278701985187700/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:51 compute-0 sudo[93748]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:52 compute-0 sudo[93900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmfgckhkmgpmrefxwywstcmcjagctfgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247091.9127629-950-148997270038786/AnsiballZ_file.py'
Feb 16 13:04:52 compute-0 sudo[93900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:52 compute-0 python3.9[93902]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:52 compute-0 sudo[93900]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:52 compute-0 sudo[94052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxwbyulkvexbjnovxhkfklgfknyjulpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247092.5523992-966-32052457173449/AnsiballZ_file.py'
Feb 16 13:04:52 compute-0 sudo[94052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:53 compute-0 python3.9[94054]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:53 compute-0 sudo[94052]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:53 compute-0 sudo[94204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcafcepzbvdjmwabjchzbfzlbsbuujtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247093.242011-982-242026944905440/AnsiballZ_stat.py'
Feb 16 13:04:53 compute-0 sudo[94204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:53 compute-0 python3.9[94206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:53 compute-0 sudo[94204]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:53 compute-0 sudo[94327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebrvgdhkhagspysbybdsytdbrnomctkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247093.242011-982-242026944905440/AnsiballZ_copy.py'
Feb 16 13:04:53 compute-0 sudo[94327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:54 compute-0 python3.9[94329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247093.242011-982-242026944905440/.source.json _original_basename=.fi3kyh2j follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:54 compute-0 sudo[94327]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:55 compute-0 python3.9[94479]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:57 compute-0 sudo[94900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhhtuqjauojimdsssizwyelztqwguyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247096.6323202-1062-269665928591308/AnsiballZ_container_config_data.py'
Feb 16 13:04:57 compute-0 sudo[94900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:57 compute-0 python3.9[94902]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 16 13:04:57 compute-0 sudo[94900]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:57 compute-0 sudo[95052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lroiboxyalcyovptonqrjpwkxcouzcvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247097.588678-1084-78505348158695/AnsiballZ_container_config_hash.py'
Feb 16 13:04:57 compute-0 sudo[95052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:58 compute-0 python3.9[95054]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:04:58 compute-0 sudo[95052]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:59 compute-0 sudo[95204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysrxbgmnjjieobbzcsuojvmanpfnlyxh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247098.5363216-1104-210400200549529/AnsiballZ_edpm_container_manage.py'
Feb 16 13:04:59 compute-0 sudo[95204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:59 compute-0 python3[95206]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:04:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:04:59 compute-0 podman[95242]: 2026-02-16 13:04:59.514654059 +0000 UTC m=+0.050776314 container create 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:04:59 compute-0 podman[95242]: 2026-02-16 13:04:59.48816195 +0000 UTC m=+0.024284305 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 13:04:59 compute-0 python3[95206]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 13:04:59 compute-0 sudo[95204]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:00 compute-0 sudo[95430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjnowqkkogcqctsqwmrslwqraqluzndw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247099.9894536-1120-87866352462976/AnsiballZ_stat.py'
Feb 16 13:05:00 compute-0 sudo[95430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:05:00 compute-0 python3.9[95432]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:00 compute-0 sudo[95430]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:00 compute-0 sudo[95584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlpafmvhaikkydznvelkfnjmdunhnody ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247100.6980658-1138-261401849232733/AnsiballZ_file.py'
Feb 16 13:05:00 compute-0 sudo[95584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:01 compute-0 python3.9[95586]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:01 compute-0 sudo[95584]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:01 compute-0 sudo[95660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erljsjgaeythaydhfmehzpjynnpwpbcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247100.6980658-1138-261401849232733/AnsiballZ_stat.py'
Feb 16 13:05:01 compute-0 sudo[95660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:01 compute-0 python3.9[95662]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:01 compute-0 sudo[95660]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:02 compute-0 sudo[95811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrifbzbzwkpgghdmznzdurzankioimft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.5995882-1138-114389474551846/AnsiballZ_copy.py'
Feb 16 13:05:02 compute-0 sudo[95811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:02 compute-0 python3.9[95813]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247101.5995882-1138-114389474551846/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:02 compute-0 sudo[95811]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:02 compute-0 sudo[95887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyiewetrbvkynruxibvscqurqprbfnov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.5995882-1138-114389474551846/AnsiballZ_systemd.py'
Feb 16 13:05:02 compute-0 sudo[95887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:02 compute-0 python3.9[95889]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:05:02 compute-0 systemd[1]: Reloading.
Feb 16 13:05:02 compute-0 systemd-rc-local-generator[95915]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:02 compute-0 systemd-sysv-generator[95919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:02 compute-0 sudo[95887]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:03 compute-0 sudo[96007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ureilatcnysppmdomgarplapszkwunwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.5995882-1138-114389474551846/AnsiballZ_systemd.py'
Feb 16 13:05:03 compute-0 sudo[96007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:03 compute-0 python3.9[96009]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:03 compute-0 systemd[1]: Reloading.
Feb 16 13:05:03 compute-0 systemd-rc-local-generator[96039]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:03 compute-0 systemd-sysv-generator[96043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:03 compute-0 systemd[1]: Starting ovn_controller container...
Feb 16 13:05:03 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 16 13:05:03 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:05:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aad9e5e978094f1eb3090c069cf90bc1cd8525a441a52024ba7785d5404aa49c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 13:05:03 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c.
Feb 16 13:05:03 compute-0 podman[96059]: 2026-02-16 13:05:03.720292118 +0000 UTC m=+0.105892223 container init 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:05:03 compute-0 ovn_controller[96072]: + sudo -E kolla_set_configs
Feb 16 13:05:03 compute-0 podman[96059]: 2026-02-16 13:05:03.745169899 +0000 UTC m=+0.130769984 container start 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:05:03 compute-0 edpm-start-podman-container[96059]: ovn_controller
Feb 16 13:05:03 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 16 13:05:03 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 16 13:05:03 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 16 13:05:03 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 16 13:05:03 compute-0 systemd[96102]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 16 13:05:03 compute-0 edpm-start-podman-container[96058]: Creating additional drop-in dependency for "ovn_controller" (19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c)
Feb 16 13:05:03 compute-0 podman[96078]: 2026-02-16 13:05:03.824535679 +0000 UTC m=+0.071829374 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:05:03 compute-0 systemd[1]: 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c-3d8c1c4c76d09cc1.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 13:05:03 compute-0 systemd[1]: 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c-3d8c1c4c76d09cc1.service: Failed with result 'exit-code'.
Feb 16 13:05:03 compute-0 systemd[1]: Reloading.
Feb 16 13:05:03 compute-0 systemd[96102]: Queued start job for default target Main User Target.
Feb 16 13:05:03 compute-0 systemd[96102]: Created slice User Application Slice.
Feb 16 13:05:03 compute-0 systemd[96102]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 16 13:05:03 compute-0 systemd[96102]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:05:03 compute-0 systemd[96102]: Reached target Paths.
Feb 16 13:05:03 compute-0 systemd[96102]: Reached target Timers.
Feb 16 13:05:03 compute-0 systemd[96102]: Starting D-Bus User Message Bus Socket...
Feb 16 13:05:03 compute-0 systemd[96102]: Starting Create User's Volatile Files and Directories...
Feb 16 13:05:03 compute-0 systemd[96102]: Finished Create User's Volatile Files and Directories.
Feb 16 13:05:03 compute-0 systemd[96102]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:05:03 compute-0 systemd[96102]: Reached target Sockets.
Feb 16 13:05:03 compute-0 systemd[96102]: Reached target Basic System.
Feb 16 13:05:03 compute-0 systemd[96102]: Reached target Main User Target.
Feb 16 13:05:03 compute-0 systemd[96102]: Startup finished in 98ms.
Feb 16 13:05:03 compute-0 systemd-rc-local-generator[96162]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:03 compute-0 systemd-sysv-generator[96167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:04 compute-0 sshd-session[96054]: Connection closed by authenticating user root 146.190.226.24 port 56252 [preauth]
Feb 16 13:05:04 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 16 13:05:04 compute-0 systemd[1]: Started Session c1 of User root.
Feb 16 13:05:04 compute-0 systemd[1]: Started ovn_controller container.
Feb 16 13:05:04 compute-0 sudo[96007]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:04 compute-0 ovn_controller[96072]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:05:04 compute-0 ovn_controller[96072]: INFO:__main__:Validating config file
Feb 16 13:05:04 compute-0 ovn_controller[96072]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:05:04 compute-0 ovn_controller[96072]: INFO:__main__:Writing out command to execute
Feb 16 13:05:04 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: ++ cat /run_command
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + ARGS=
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + sudo kolla_copy_cacerts
Feb 16 13:05:04 compute-0 systemd[1]: Started Session c2 of User root.
Feb 16 13:05:04 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + [[ ! -n '' ]]
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + . kolla_extend_start
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 16 13:05:04 compute-0 ovn_controller[96072]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + umask 0022
Feb 16 13:05:04 compute-0 ovn_controller[96072]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2072] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2081] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <warn>  [1771247104.2085] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2091] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2097] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2100] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:05:04 compute-0 kernel: br-int: entered promiscuous mode
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00024|main|INFO|OVS feature set changed, force recompute.
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-0 ovn_controller[96072]: 2026-02-16T13:05:04Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2291] manager: (ovn-16940e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 16 13:05:04 compute-0 systemd-udevd[96215]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:05:04 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2427] device (genev_sys_6081): carrier: link connected
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.2433] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 16 13:05:04 compute-0 systemd-udevd[96218]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:05:04 compute-0 NetworkManager[56177]: <info>  [1771247104.3615] manager: (ovn-54c1a2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 16 13:05:05 compute-0 python3.9[96345]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:05:05 compute-0 sudo[96495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reitaxrozdzrpsccsslyqcwfxutjotae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247105.5329254-1228-191342157936038/AnsiballZ_stat.py'
Feb 16 13:05:05 compute-0 sudo[96495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:05 compute-0 python3.9[96497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:06 compute-0 sudo[96495]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:06 compute-0 sudo[96618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtfukqlqflbidmgrnvhdrtwjobviivdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247105.5329254-1228-191342157936038/AnsiballZ_copy.py'
Feb 16 13:05:06 compute-0 sudo[96618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:06 compute-0 python3.9[96620]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247105.5329254-1228-191342157936038/.source.yaml _original_basename=.7qmzih5m follow=False checksum=1ce770b9a19b4b0066c27b1fbba4d3923dbce27b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:06 compute-0 sudo[96618]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:06 compute-0 sudo[96770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nopxlmlgsexldoxrqrwthvgovtwiogoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247106.6725192-1258-193418172719610/AnsiballZ_command.py'
Feb 16 13:05:06 compute-0 sudo[96770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:07 compute-0 python3.9[96772]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:07 compute-0 ovs-vsctl[96773]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 16 13:05:07 compute-0 sudo[96770]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:07 compute-0 sudo[96923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cygrqxnwvdpxsvstgomdwrtwoigqvixn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247107.2440207-1274-119508939028674/AnsiballZ_command.py'
Feb 16 13:05:07 compute-0 sudo[96923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:07 compute-0 python3.9[96925]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:07 compute-0 ovs-vsctl[96927]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 16 13:05:07 compute-0 sudo[96923]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:08 compute-0 sudo[97078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntdrruwymdhigbsozjqprpqbajfjkeny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247108.200057-1302-134341063569108/AnsiballZ_command.py'
Feb 16 13:05:08 compute-0 sudo[97078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:08 compute-0 python3.9[97080]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:08 compute-0 ovs-vsctl[97081]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 16 13:05:08 compute-0 sudo[97078]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:09 compute-0 sshd-session[85567]: Connection closed by 192.168.122.30 port 41374
Feb 16 13:05:09 compute-0 sshd-session[85564]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:05:09 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Feb 16 13:05:09 compute-0 systemd[1]: session-20.scope: Consumed 39.801s CPU time.
Feb 16 13:05:09 compute-0 systemd-logind[818]: Session 20 logged out. Waiting for processes to exit.
Feb 16 13:05:09 compute-0 systemd-logind[818]: Removed session 20.
Feb 16 13:05:14 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 16 13:05:14 compute-0 systemd[96102]: Activating special unit Exit the Session...
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped target Main User Target.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped target Basic System.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped target Paths.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped target Sockets.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped target Timers.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:05:14 compute-0 systemd[96102]: Closed D-Bus User Message Bus Socket.
Feb 16 13:05:14 compute-0 systemd[96102]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:05:14 compute-0 systemd[96102]: Removed slice User Application Slice.
Feb 16 13:05:14 compute-0 systemd[96102]: Reached target Shutdown.
Feb 16 13:05:14 compute-0 systemd[96102]: Finished Exit the Session.
Feb 16 13:05:14 compute-0 systemd[96102]: Reached target Exit the Session.
Feb 16 13:05:14 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 16 13:05:14 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 16 13:05:14 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 16 13:05:14 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 16 13:05:14 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 16 13:05:14 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 16 13:05:14 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 16 13:05:15 compute-0 sshd-session[97108]: Accepted publickey for zuul from 192.168.122.30 port 57748 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:05:15 compute-0 systemd-logind[818]: New session 22 of user zuul.
Feb 16 13:05:15 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 16 13:05:15 compute-0 sshd-session[97108]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:05:16 compute-0 python3.9[97261]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:05:17 compute-0 sudo[97415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzyewpfbyygyazzarwamhkowuildcmfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247116.8830123-48-232962926323306/AnsiballZ_file.py'
Feb 16 13:05:17 compute-0 sudo[97415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:17 compute-0 python3.9[97417]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:17 compute-0 sudo[97415]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:18 compute-0 sudo[97567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfldbcjvimeaxdbufpdcscnzqurmqjwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247117.7606-48-110527166860191/AnsiballZ_file.py'
Feb 16 13:05:18 compute-0 sudo[97567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:18 compute-0 python3.9[97569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:18 compute-0 sudo[97567]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:18 compute-0 sudo[97719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gijrmwqgpgheamoemoqextqebkiohohs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247118.4677873-48-13753438920560/AnsiballZ_file.py'
Feb 16 13:05:18 compute-0 sudo[97719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:18 compute-0 python3.9[97721]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:19 compute-0 sudo[97719]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:19 compute-0 sudo[97871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrfowekwsoqvhvogtqakmcjfsthixdjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247119.2817397-48-199525398649964/AnsiballZ_file.py'
Feb 16 13:05:19 compute-0 sudo[97871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:19 compute-0 python3.9[97873]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:19 compute-0 sudo[97871]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:20 compute-0 sudo[98023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvazcnhicrwstldswpxjfuvabuqbexwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247120.0494256-48-277083813159609/AnsiballZ_file.py'
Feb 16 13:05:20 compute-0 sudo[98023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:20 compute-0 python3.9[98025]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:20 compute-0 sudo[98023]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:21 compute-0 python3.9[98175]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:05:22 compute-0 sudo[98325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmxofjrzddwupsweuloylijufjmisuze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247121.6067832-136-241957035710068/AnsiballZ_seboolean.py'
Feb 16 13:05:22 compute-0 sudo[98325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:22 compute-0 python3.9[98327]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 13:05:22 compute-0 sudo[98325]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:23 compute-0 python3.9[98478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:24 compute-0 python3.9[98599]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247122.9911466-152-73452203397469/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:25 compute-0 python3.9[98749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:25 compute-0 python3.9[98870]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247124.4857564-182-110799891487183/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:26 compute-0 sudo[99020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigwvjuxifeumsgpphmylmvptypozcdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247126.084883-216-133763670553708/AnsiballZ_setup.py'
Feb 16 13:05:26 compute-0 sudo[99020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:26 compute-0 python3.9[99022]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:05:26 compute-0 sudo[99020]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:27 compute-0 sudo[99104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egykzeqmrfaztlcnyzvourexqnhakvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247126.084883-216-133763670553708/AnsiballZ_dnf.py'
Feb 16 13:05:27 compute-0 sudo[99104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:27 compute-0 python3.9[99106]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:05:28 compute-0 sudo[99104]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:29 compute-0 sudo[99257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urtvdsfjmtwfadsodmufxbvbospzacdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247129.0780334-240-173804042805523/AnsiballZ_systemd.py'
Feb 16 13:05:29 compute-0 sudo[99257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:30 compute-0 python3.9[99259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:05:30 compute-0 sudo[99257]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:30 compute-0 python3.9[99412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:31 compute-0 python3.9[99533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247130.276666-256-247186386740905/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:31 compute-0 python3.9[99683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:32 compute-0 python3.9[99804]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247131.356384-256-200977738596978/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:33 compute-0 python3.9[99954]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:33 compute-0 ovn_controller[96072]: 2026-02-16T13:05:33Z|00025|memory|INFO|16896 kB peak resident set size after 29.8 seconds
Feb 16 13:05:33 compute-0 ovn_controller[96072]: 2026-02-16T13:05:33Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Feb 16 13:05:34 compute-0 podman[100049]: 2026-02-16 13:05:34.024501302 +0000 UTC m=+0.079562344 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 16 13:05:34 compute-0 python3.9[100085]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247133.2118504-344-144784662037456/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:34 compute-0 python3.9[100249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:35 compute-0 python3.9[100370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247134.2679343-344-115382134406201/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:35 compute-0 python3.9[100520]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:36 compute-0 sudo[100672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfrymkanmnmdksuxlowtsawvejboedzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.133284-420-107675409909318/AnsiballZ_file.py'
Feb 16 13:05:36 compute-0 sudo[100672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:36 compute-0 python3.9[100674]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:36 compute-0 sudo[100672]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:37 compute-0 sudo[100824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnyhvswocztotbkvgsylmnnifikloup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.8214805-436-87732266837668/AnsiballZ_stat.py'
Feb 16 13:05:37 compute-0 sudo[100824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:37 compute-0 python3.9[100826]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:37 compute-0 sudo[100824]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:37 compute-0 sudo[100902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pserdndexratoznsduvrnheplcvdozaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.8214805-436-87732266837668/AnsiballZ_file.py'
Feb 16 13:05:37 compute-0 sudo[100902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:37 compute-0 python3.9[100904]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:37 compute-0 sudo[100902]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:38 compute-0 sudo[101054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkxxjkfzqpzwnolremlvumobonhobeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247137.9084344-436-109846333038966/AnsiballZ_stat.py'
Feb 16 13:05:38 compute-0 sudo[101054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:38 compute-0 python3.9[101056]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:38 compute-0 sudo[101054]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:38 compute-0 sudo[101132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnklzvdunrkqtujrihyntmvggjwjhjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247137.9084344-436-109846333038966/AnsiballZ_file.py'
Feb 16 13:05:38 compute-0 sudo[101132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:38 compute-0 python3.9[101134]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:38 compute-0 sudo[101132]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:39 compute-0 sudo[101284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edagyvtecbffqjeopmrpvvetnmlgzagr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247138.991283-482-54466086489643/AnsiballZ_file.py'
Feb 16 13:05:39 compute-0 sudo[101284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:39 compute-0 python3.9[101286]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:39 compute-0 sudo[101284]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:39 compute-0 sudo[101436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezgqyzqjlyrhnsdsxjslnqomuukdjvev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247139.6734862-498-238775734443109/AnsiballZ_stat.py'
Feb 16 13:05:39 compute-0 sudo[101436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:40 compute-0 python3.9[101438]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:40 compute-0 sudo[101436]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:40 compute-0 sudo[101514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzsysxnuguzpnwwkuxnhafxmjktfbmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247139.6734862-498-238775734443109/AnsiballZ_file.py'
Feb 16 13:05:40 compute-0 sudo[101514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:40 compute-0 python3.9[101516]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:40 compute-0 sudo[101514]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:41 compute-0 sudo[101666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfoezyqfdcuohclolznipnzvduffqol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247140.7852476-522-36038590638668/AnsiballZ_stat.py'
Feb 16 13:05:41 compute-0 sudo[101666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:41 compute-0 python3.9[101668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:41 compute-0 sudo[101666]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:41 compute-0 sudo[101744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evujtzcvlgaulplxzmbzqoeyifyuizxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247140.7852476-522-36038590638668/AnsiballZ_file.py'
Feb 16 13:05:41 compute-0 sudo[101744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:41 compute-0 python3.9[101746]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:41 compute-0 sudo[101744]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:42 compute-0 sudo[101896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzpnhwvqpbocnmjcajhccuztkfhgugej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247141.8611116-546-197679713671710/AnsiballZ_systemd.py'
Feb 16 13:05:42 compute-0 sudo[101896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:42 compute-0 python3.9[101898]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:42 compute-0 systemd[1]: Reloading.
Feb 16 13:05:42 compute-0 systemd-rc-local-generator[101922]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:42 compute-0 systemd-sysv-generator[101928]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:42 compute-0 sudo[101896]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:43 compute-0 sudo[102094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wybkwrzxzdpchtmdtuhubwqpnpgbsxxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247142.9150407-562-254503039528515/AnsiballZ_stat.py'
Feb 16 13:05:43 compute-0 sudo[102094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:43 compute-0 python3.9[102096]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:43 compute-0 sudo[102094]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:43 compute-0 sudo[102172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxyqhcgjsijyuovejzcnudqampcimre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247142.9150407-562-254503039528515/AnsiballZ_file.py'
Feb 16 13:05:43 compute-0 sudo[102172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:43 compute-0 python3.9[102174]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:43 compute-0 sudo[102172]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:44 compute-0 sudo[102324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cicrnldcdjihannfdjxmmimiznbrzanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247144.1312978-586-13879734805109/AnsiballZ_stat.py'
Feb 16 13:05:44 compute-0 sudo[102324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:44 compute-0 python3.9[102326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:44 compute-0 sudo[102324]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:44 compute-0 sudo[102402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zorieiwlbryswwbgeupajxonuhrnsbtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247144.1312978-586-13879734805109/AnsiballZ_file.py'
Feb 16 13:05:44 compute-0 sudo[102402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:45 compute-0 python3.9[102404]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:45 compute-0 sudo[102402]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:45 compute-0 sudo[102554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sewrcooqylifrerqmbzuokppacerucbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247145.2340207-610-209021996819301/AnsiballZ_systemd.py'
Feb 16 13:05:45 compute-0 sudo[102554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:45 compute-0 python3.9[102556]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:45 compute-0 systemd[1]: Reloading.
Feb 16 13:05:45 compute-0 systemd-rc-local-generator[102583]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:45 compute-0 systemd-sysv-generator[102588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:46 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 13:05:46 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:05:46 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:05:46 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 13:05:46 compute-0 sudo[102554]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:46 compute-0 sudo[102754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdgyuwfidpcpcqyhqrejaxwtufidsqlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247146.3954864-630-264742852937407/AnsiballZ_file.py'
Feb 16 13:05:46 compute-0 sudo[102754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:46 compute-0 python3.9[102756]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:46 compute-0 sudo[102754]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:47 compute-0 sudo[102906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dojegaktbsioqdpjjdesssyxgmipcxmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247147.0788174-646-141215917649661/AnsiballZ_stat.py'
Feb 16 13:05:47 compute-0 sudo[102906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:47 compute-0 python3.9[102908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:47 compute-0 sudo[102906]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:47 compute-0 sudo[103029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lffzfurzzobbhbmyrhjcgqqdqandyjsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247147.0788174-646-141215917649661/AnsiballZ_copy.py'
Feb 16 13:05:47 compute-0 sudo[103029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:48 compute-0 python3.9[103031]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247147.0788174-646-141215917649661/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:48 compute-0 sudo[103029]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:48 compute-0 sudo[103181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atoldjcwyexyeheppejndzmtowlytube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247148.616897-680-224892914355874/AnsiballZ_file.py'
Feb 16 13:05:48 compute-0 sudo[103181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:49 compute-0 python3.9[103183]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:49 compute-0 sudo[103181]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:49 compute-0 sudo[103333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kppvxgpnutozwwwnifyaglmdkczknuux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.3389018-696-112717667964993/AnsiballZ_file.py'
Feb 16 13:05:49 compute-0 sudo[103333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:49 compute-0 python3.9[103335]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:49 compute-0 sudo[103333]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:50 compute-0 sudo[103485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfcunowsheobnxpgghddnbvcqbldvyyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.9687145-712-60645271871853/AnsiballZ_stat.py'
Feb 16 13:05:50 compute-0 sudo[103485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:50 compute-0 python3.9[103487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:50 compute-0 sudo[103485]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:50 compute-0 sudo[103608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrurcddwvdqaqrlghbddoqsjkycimvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.9687145-712-60645271871853/AnsiballZ_copy.py'
Feb 16 13:05:50 compute-0 sudo[103608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:51 compute-0 python3.9[103610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247149.9687145-712-60645271871853/.source.json _original_basename=.t88k6cje follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:51 compute-0 sudo[103608]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:51 compute-0 python3.9[103760]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:53 compute-0 sudo[104181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qthuhjsvvpcnxdmuivlxaoblkqzucqnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247153.52041-792-16588952311278/AnsiballZ_container_config_data.py'
Feb 16 13:05:53 compute-0 sudo[104181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:54 compute-0 python3.9[104183]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 16 13:05:54 compute-0 sudo[104181]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:55 compute-0 sudo[104333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dywpnfgrhwanrhcllwrtwuzdgvtonlru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247154.6110995-814-147296127612162/AnsiballZ_container_config_hash.py'
Feb 16 13:05:55 compute-0 sudo[104333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:55 compute-0 python3.9[104335]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:05:55 compute-0 sudo[104333]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:56 compute-0 sudo[104485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsrjzsojkcstpcqrrzxrpifjegzrbaqk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247155.6294823-834-210813593090739/AnsiballZ_edpm_container_manage.py'
Feb 16 13:05:56 compute-0 sudo[104485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:56 compute-0 python3[104487]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:05:56 compute-0 podman[104525]: 2026-02-16 13:05:56.580321022 +0000 UTC m=+0.050723351 container create d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:05:56 compute-0 podman[104525]: 2026-02-16 13:05:56.550961609 +0000 UTC m=+0.021363948 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:05:56 compute-0 python3[104487]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:05:56 compute-0 sudo[104485]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:57 compute-0 sudo[104714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzvgnhlhmygqizysrqililpssxaxdndy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247156.8753512-850-218649604862465/AnsiballZ_stat.py'
Feb 16 13:05:57 compute-0 sudo[104714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:57 compute-0 python3.9[104716]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:57 compute-0 sudo[104714]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:58 compute-0 sudo[104868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxmgykomqafbxxxlpnevxqoitxsrsopv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247157.8829076-868-248202224916219/AnsiballZ_file.py'
Feb 16 13:05:58 compute-0 sudo[104868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:58 compute-0 python3.9[104870]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:58 compute-0 sudo[104868]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:58 compute-0 sudo[104944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nghpwqcwnvpndqijaldwjfldkrwzfxis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247157.8829076-868-248202224916219/AnsiballZ_stat.py'
Feb 16 13:05:58 compute-0 sudo[104944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:58 compute-0 python3.9[104946]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:58 compute-0 sudo[104944]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:59 compute-0 sudo[105095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfzaihjzlsmjvhldlbdzsnpcapwmrqgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.9254444-868-224040749714870/AnsiballZ_copy.py'
Feb 16 13:05:59 compute-0 sudo[105095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:59 compute-0 python3.9[105097]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247158.9254444-868-224040749714870/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:59 compute-0 sudo[105095]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:59 compute-0 sudo[105171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvbctiriqmoeukytuwyyqcbeykacoiab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.9254444-868-224040749714870/AnsiballZ_systemd.py'
Feb 16 13:05:59 compute-0 sudo[105171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:00 compute-0 python3.9[105173]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:00 compute-0 systemd[1]: Reloading.
Feb 16 13:06:00 compute-0 systemd-rc-local-generator[105198]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:00 compute-0 systemd-sysv-generator[105204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:00 compute-0 sudo[105171]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:00 compute-0 sudo[105289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrnepikdfjzsutcjntnrbzeyuemwpdme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.9254444-868-224040749714870/AnsiballZ_systemd.py'
Feb 16 13:06:00 compute-0 sudo[105289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:00 compute-0 python3.9[105291]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:01 compute-0 systemd[1]: Reloading.
Feb 16 13:06:01 compute-0 systemd-rc-local-generator[105322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:01 compute-0 systemd-sysv-generator[105325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:01 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 16 13:06:01 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:06:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/164857ab85fe5bf4eadbfb7b2db466669393443b733ebc6d1a9796efd7440785/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 16 13:06:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/164857ab85fe5bf4eadbfb7b2db466669393443b733ebc6d1a9796efd7440785/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:06:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9.
Feb 16 13:06:01 compute-0 podman[105339]: 2026-02-16 13:06:01.389682502 +0000 UTC m=+0.160580565 container init d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + sudo -E kolla_set_configs
Feb 16 13:06:01 compute-0 podman[105339]: 2026-02-16 13:06:01.418270143 +0000 UTC m=+0.189168196 container start d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:06:01 compute-0 edpm-start-podman-container[105339]: ovn_metadata_agent
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Validating config file
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Copying service configuration files
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Writing out command to execute
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: ++ cat /run_command
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + CMD=neutron-ovn-metadata-agent
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + ARGS=
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + sudo kolla_copy_cacerts
Feb 16 13:06:01 compute-0 podman[105362]: 2026-02-16 13:06:01.478753127 +0000 UTC m=+0.048511138 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 13:06:01 compute-0 edpm-start-podman-container[105338]: Creating additional drop-in dependency for "ovn_metadata_agent" (d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9)
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + [[ ! -n '' ]]
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + . kolla_extend_start
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: Running command: 'neutron-ovn-metadata-agent'
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + umask 0022
Feb 16 13:06:01 compute-0 ovn_metadata_agent[105355]: + exec neutron-ovn-metadata-agent
Feb 16 13:06:01 compute-0 systemd[1]: Reloading.
Feb 16 13:06:01 compute-0 systemd-rc-local-generator[105434]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:01 compute-0 systemd-sysv-generator[105438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:01 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 16 13:06:01 compute-0 sudo[105289]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:02 compute-0 python3.9[105602]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.147 105360 INFO neutron.common.config [-] Logging enabled!
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.148 105360 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.148 105360 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.148 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.148 105360 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.148 105360 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.149 105360 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.150 105360 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.151 105360 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.152 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.153 105360 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.154 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.155 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.156 105360 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.157 105360 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.158 105360 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.159 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.160 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.161 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.162 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.163 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.164 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.165 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.166 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.167 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.168 105360 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.169 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.170 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.171 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.172 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.173 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.174 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.175 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.176 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.177 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.178 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.178 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.178 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.178 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.178 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.179 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.180 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.181 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.182 105360 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.192 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.192 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.192 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.193 105360 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.193 105360 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.210 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name b0e583b2-47d7-4bde-bbd6-282143e0c194 (UUID: b0e583b2-47d7-4bde-bbd6-282143e0c194) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.243 105360 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.243 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.243 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.244 105360 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.249 105360 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.256 105360 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.263 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'b0e583b2-47d7-4bde-bbd6-282143e0c194'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], external_ids={}, name=b0e583b2-47d7-4bde-bbd6-282143e0c194, nb_cfg_timestamp=1771247112231, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.264 105360 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f2e53046ac0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.265 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.266 105360 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.266 105360 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.267 105360 INFO oslo_service.service [-] Starting 1 workers
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.271 105360 DEBUG oslo_service.service [-] Started child 105653 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.274 105360 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6rxvbr1e/privsep.sock']
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.275 105653 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-192900'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.301 105653 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.301 105653 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.301 105653 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.305 105653 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.313 105653 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.320 105653 INFO eventlet.wsgi.server [-] (105653) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 16 13:06:03 compute-0 sudo[105757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdowvvtqrrdzyowoxukuftpwkonyoyzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247163.2537756-958-240486191189944/AnsiballZ_stat.py'
Feb 16 13:06:03 compute-0 sudo[105757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:03 compute-0 python3.9[105759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:06:03 compute-0 sudo[105757]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:03 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.846 105360 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.846 105360 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6rxvbr1e/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.729 105762 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.736 105762 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.742 105762 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.742 105762 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105762
Feb 16 13:06:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:03.849 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[85c06360-e01d-445d-9554-3c4a33b075b6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:06:04 compute-0 sudo[105893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwefxbadegjrghmcbgfaoqynizfoddmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247163.2537756-958-240486191189944/AnsiballZ_copy.py'
Feb 16 13:06:04 compute-0 sudo[105893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:04 compute-0 podman[105861]: 2026-02-16 13:06:04.218191093 +0000 UTC m=+0.098483299 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 13:06:04 compute-0 python3.9[105899]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247163.2537756-958-240486191189944/.source.yaml _original_basename=.8a7q7wo8 follow=False checksum=e321a77da89f80ab6ea1e75aee3ae5cf00c93c93 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.345 105762 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.346 105762 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.346 105762 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:06:04 compute-0 sudo[105893]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.826 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e35c3d52-a5c5-4959-90ca-0d448cdbb166]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.829 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, column=external_ids, values=({'neutron:ovn-metadata-id': 'eb717e8e-59f7-5ce1-b6d2-cdbcad0491ef'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.845 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:06:04 compute-0 sshd-session[97111]: Connection closed by 192.168.122.30 port 57748
Feb 16 13:06:04 compute-0 sshd-session[97108]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.854 105360 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 16 13:06:04 compute-0 systemd[1]: session-22.scope: Consumed 31.603s CPU time.
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.855 105360 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.856 105360 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 systemd-logind[818]: Session 22 logged out. Waiting for processes to exit.
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.857 105360 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.857 105360 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.857 105360 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.857 105360 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.858 105360 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 systemd-logind[818]: Removed session 22.
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.859 105360 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.860 105360 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.861 105360 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.862 105360 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.863 105360 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.864 105360 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.865 105360 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.866 105360 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.867 105360 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.868 105360 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.869 105360 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.870 105360 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.871 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.872 105360 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.873 105360 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.874 105360 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.875 105360 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.876 105360 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.877 105360 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.878 105360 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.879 105360 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.880 105360 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.881 105360 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.882 105360 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.883 105360 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.884 105360 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.885 105360 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.886 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.887 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.888 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.889 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.890 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:04 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:06:04.891 105360 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:06:05 compute-0 sshd-session[105937]: Connection closed by authenticating user root 146.190.226.24 port 52536 [preauth]
Feb 16 13:06:10 compute-0 sshd-session[105939]: Accepted publickey for zuul from 192.168.122.30 port 54578 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:06:10 compute-0 systemd-logind[818]: New session 23 of user zuul.
Feb 16 13:06:10 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 16 13:06:10 compute-0 sshd-session[105939]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:06:11 compute-0 python3.9[106092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:06:13 compute-0 sudo[106246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxrdbmomhayrvhiwnconlwiqnyuigyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247171.753336-48-63609078675107/AnsiballZ_command.py'
Feb 16 13:06:13 compute-0 sudo[106246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:13 compute-0 python3.9[106248]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:13 compute-0 sudo[106246]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:14 compute-0 sudo[106411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xopfabnpmaalmyauxpcfhusysvwkzcpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247173.8782923-70-118015067831287/AnsiballZ_systemd_service.py'
Feb 16 13:06:14 compute-0 sudo[106411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:14 compute-0 python3.9[106413]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:14 compute-0 systemd[1]: Reloading.
Feb 16 13:06:14 compute-0 systemd-sysv-generator[106444]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:14 compute-0 systemd-rc-local-generator[106436]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:14 compute-0 sudo[106411]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:15 compute-0 python3.9[106606]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:06:15 compute-0 network[106623]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:06:15 compute-0 network[106624]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:06:15 compute-0 network[106625]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:06:18 compute-0 sudo[106885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcdaaffackfzevsytfbrecifospvlsqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247178.288869-108-279407629384663/AnsiballZ_systemd_service.py'
Feb 16 13:06:18 compute-0 sudo[106885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:18 compute-0 python3.9[106887]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:18 compute-0 sudo[106885]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:19 compute-0 sudo[107038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtehxwajebehkzepcgakfwmjkkvfmnkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247179.0525959-108-126813057059986/AnsiballZ_systemd_service.py'
Feb 16 13:06:19 compute-0 sudo[107038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:19 compute-0 python3.9[107040]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:19 compute-0 sudo[107038]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:20 compute-0 sudo[107191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llygwbzlzdugoiglcgccqpbydbflevmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247179.7795-108-16301710220882/AnsiballZ_systemd_service.py'
Feb 16 13:06:20 compute-0 sudo[107191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:20 compute-0 python3.9[107193]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:20 compute-0 sudo[107191]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:20 compute-0 sudo[107344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvlecujwnhqieedrgxcyvjfkydsqcscf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247180.4951174-108-238017823719505/AnsiballZ_systemd_service.py'
Feb 16 13:06:20 compute-0 sudo[107344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:21 compute-0 python3.9[107346]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:21 compute-0 sudo[107344]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:21 compute-0 sudo[107497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygdyajxlztgpmczcmlkdbylfojlcrld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247181.2216072-108-178011433800731/AnsiballZ_systemd_service.py'
Feb 16 13:06:21 compute-0 sudo[107497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:21 compute-0 python3.9[107499]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:21 compute-0 sudo[107497]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:22 compute-0 sudo[107650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhzjzoasfetxsitxccijvxqlttaepecy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247181.9149907-108-76801306528190/AnsiballZ_systemd_service.py'
Feb 16 13:06:22 compute-0 sudo[107650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:22 compute-0 python3.9[107652]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:22 compute-0 sudo[107650]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:22 compute-0 sudo[107803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyfqniarljedopstxdvghkwekgaysuas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247182.6145453-108-31741082978592/AnsiballZ_systemd_service.py'
Feb 16 13:06:22 compute-0 sudo[107803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:23 compute-0 python3.9[107805]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:23 compute-0 sudo[107803]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:25 compute-0 sudo[107956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inxaoekjbxnplftzcjavoukaslqkcyax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247184.9093044-212-157165998937961/AnsiballZ_file.py'
Feb 16 13:06:25 compute-0 sudo[107956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:25 compute-0 python3.9[107958]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:25 compute-0 sudo[107956]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:25 compute-0 sudo[108108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzkvdqjgzexqfxaanbvfwvopxevrlaec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247185.6648998-212-150025865041264/AnsiballZ_file.py'
Feb 16 13:06:25 compute-0 sudo[108108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:26 compute-0 python3.9[108110]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:26 compute-0 sudo[108108]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:26 compute-0 sudo[108260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpgwkffubcdzksgrhkgotnwyzvdexcxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247186.222716-212-236348712758029/AnsiballZ_file.py'
Feb 16 13:06:26 compute-0 sudo[108260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:26 compute-0 python3.9[108262]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:26 compute-0 sudo[108260]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:26 compute-0 sudo[108412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzmxalugrjstzwfrziiiiouyhswbnfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247186.7632637-212-240893061669376/AnsiballZ_file.py'
Feb 16 13:06:26 compute-0 sudo[108412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:27 compute-0 python3.9[108414]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:27 compute-0 sudo[108412]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:27 compute-0 sudo[108564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsfocffwidntrkrolqmfpgcgpzybzhhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247187.3213708-212-77079969283341/AnsiballZ_file.py'
Feb 16 13:06:27 compute-0 sudo[108564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:27 compute-0 python3.9[108566]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:27 compute-0 sudo[108564]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:28 compute-0 sudo[108716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqllyyuklizmlzbhetvyinvtavybadwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247188.0559335-212-46951909739692/AnsiballZ_file.py'
Feb 16 13:06:28 compute-0 sudo[108716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:28 compute-0 python3.9[108718]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:28 compute-0 sudo[108716]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:28 compute-0 sudo[108868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orvdkntjmpkmbqxuzrssykmbmjnqcudo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247188.5810027-212-255679753795110/AnsiballZ_file.py'
Feb 16 13:06:28 compute-0 sudo[108868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:29 compute-0 python3.9[108870]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:29 compute-0 sudo[108868]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:29 compute-0 sudo[109020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrbssvjgxexjvxnwhwiqacmaykxrqtyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247189.183531-312-266602079251877/AnsiballZ_file.py'
Feb 16 13:06:29 compute-0 sudo[109020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:29 compute-0 python3.9[109022]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:29 compute-0 sudo[109020]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:30 compute-0 sudo[109172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbxtntclwjoizpxrgwscomqlxizwgekc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247189.8080513-312-262794397088987/AnsiballZ_file.py'
Feb 16 13:06:30 compute-0 sudo[109172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:30 compute-0 python3.9[109174]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:30 compute-0 sudo[109172]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:30 compute-0 sudo[109324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llhkkkimitvyssbqzfcwbqexajwquxun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247190.3207135-312-51140109565521/AnsiballZ_file.py'
Feb 16 13:06:30 compute-0 sudo[109324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:30 compute-0 python3.9[109326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:30 compute-0 sudo[109324]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:31 compute-0 sudo[109476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyejldlpllfrkbxckkeqxbbycrucvmev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247190.8906367-312-194319685604262/AnsiballZ_file.py'
Feb 16 13:06:31 compute-0 sudo[109476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:31 compute-0 python3.9[109478]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:31 compute-0 sudo[109476]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:31 compute-0 sudo[109639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vexxwzcxnlhlsognglielwkhntqlpnts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247191.4381669-312-76493570487332/AnsiballZ_file.py'
Feb 16 13:06:31 compute-0 sudo[109639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:31 compute-0 podman[109602]: 2026-02-16 13:06:31.759000702 +0000 UTC m=+0.077837767 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:06:31 compute-0 python3.9[109643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:31 compute-0 sudo[109639]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:32 compute-0 sudo[109799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvedwpsqzxmsfufjbfdrtgzgqrrhktwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247192.0628073-312-72801527833448/AnsiballZ_file.py'
Feb 16 13:06:32 compute-0 sudo[109799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:32 compute-0 python3.9[109801]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:32 compute-0 sudo[109799]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:32 compute-0 sudo[109951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eatrlkfyednbfzotqtokralegyislien ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247192.647458-312-85681384781056/AnsiballZ_file.py'
Feb 16 13:06:32 compute-0 sudo[109951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:33 compute-0 python3.9[109953]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:33 compute-0 sudo[109951]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:33 compute-0 sudo[110103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jumxgiwnfxjxujsgflbcroxwtdaldced ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247193.3434725-414-229986902105500/AnsiballZ_command.py'
Feb 16 13:06:33 compute-0 sudo[110103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:33 compute-0 python3.9[110105]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:33 compute-0 sudo[110103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:34 compute-0 podman[110231]: 2026-02-16 13:06:34.443168395 +0000 UTC m=+0.106720645 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:06:34 compute-0 python3.9[110267]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:06:35 compute-0 sudo[110431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgziqfmaeosaulhwoeddmnqagxfyimec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247194.8249283-450-278802477547211/AnsiballZ_systemd_service.py'
Feb 16 13:06:35 compute-0 sudo[110431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:35 compute-0 python3.9[110433]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:35 compute-0 systemd[1]: Reloading.
Feb 16 13:06:35 compute-0 systemd-rc-local-generator[110454]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:35 compute-0 systemd-sysv-generator[110461]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:35 compute-0 sudo[110431]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:35 compute-0 sudo[110625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmziexfjuxxtdumvtrydffobybxdnksd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247195.7327695-466-210137392552361/AnsiballZ_command.py'
Feb 16 13:06:35 compute-0 sudo[110625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:36 compute-0 python3.9[110627]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:36 compute-0 sudo[110625]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:36 compute-0 sudo[110778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpsboohvtzjuhypdslkoofqhsstnuaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247196.2741632-466-83916150359392/AnsiballZ_command.py'
Feb 16 13:06:36 compute-0 sudo[110778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:36 compute-0 python3.9[110780]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:36 compute-0 sudo[110778]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:37 compute-0 sudo[110931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soayeutmccxfnykdrenhsoetukmieixz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247196.834798-466-144462495803730/AnsiballZ_command.py'
Feb 16 13:06:37 compute-0 sudo[110931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:37 compute-0 python3.9[110933]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:37 compute-0 sudo[110931]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:37 compute-0 sudo[111084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwzxqtpmrfoszemrlazrtxeguwfzkeyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247197.4167464-466-33050907072934/AnsiballZ_command.py'
Feb 16 13:06:37 compute-0 sudo[111084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:37 compute-0 python3.9[111086]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:37 compute-0 sudo[111084]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:38 compute-0 sudo[111237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufduiplrsdyxrqdzeoflmcjpejiclgaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247197.9452837-466-65926626812135/AnsiballZ_command.py'
Feb 16 13:06:38 compute-0 sudo[111237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:38 compute-0 python3.9[111239]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:38 compute-0 sudo[111237]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:38 compute-0 sudo[111390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajptrqnlmryisovoukhktrsyfauwlvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247198.4872906-466-129658827610591/AnsiballZ_command.py'
Feb 16 13:06:38 compute-0 sudo[111390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:38 compute-0 python3.9[111392]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:38 compute-0 sudo[111390]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:39 compute-0 sudo[111543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrqkdwhlfcqmvtzleswvpliyaftdxoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247199.029207-466-275025050194107/AnsiballZ_command.py'
Feb 16 13:06:39 compute-0 sudo[111543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:39 compute-0 python3.9[111545]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:39 compute-0 sudo[111543]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:40 compute-0 sudo[111696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqeumlkxzflnmtedruvwlkxsbcfliml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247199.9941213-574-16411474351667/AnsiballZ_getent.py'
Feb 16 13:06:40 compute-0 sudo[111696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:40 compute-0 python3.9[111698]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 16 13:06:40 compute-0 sudo[111696]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:41 compute-0 sudo[111849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdrjzrqsyoqnspwidsasnggsezrgzld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247200.760459-590-236035869589405/AnsiballZ_group.py'
Feb 16 13:06:41 compute-0 sudo[111849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:41 compute-0 python3.9[111851]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:06:41 compute-0 groupadd[111852]: group added to /etc/group: name=libvirt, GID=42473
Feb 16 13:06:41 compute-0 groupadd[111852]: group added to /etc/gshadow: name=libvirt
Feb 16 13:06:41 compute-0 groupadd[111852]: new group: name=libvirt, GID=42473
Feb 16 13:06:41 compute-0 sudo[111849]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:42 compute-0 sudo[112007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyntqecgipppmbwpreoplcczekeenwer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247201.6463408-606-6344852803915/AnsiballZ_user.py'
Feb 16 13:06:42 compute-0 sudo[112007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:42 compute-0 python3.9[112009]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:06:42 compute-0 useradd[112011]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 13:06:42 compute-0 sudo[112007]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:43 compute-0 sudo[112167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvwmsnhwjptzgknougpmxfvogloqjbwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247202.83543-628-268870737430023/AnsiballZ_setup.py'
Feb 16 13:06:43 compute-0 sudo[112167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:43 compute-0 python3.9[112169]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:06:43 compute-0 sudo[112167]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:44 compute-0 sudo[112251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnhfbjhuphamvekbnxgkxallbjcmdwaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247202.83543-628-268870737430023/AnsiballZ_dnf.py'
Feb 16 13:06:44 compute-0 sudo[112251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:44 compute-0 python3.9[112253]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:06:44 compute-0 sshd-session[112256]: error: kex_exchange_identification: read: Connection reset by peer
Feb 16 13:06:44 compute-0 sshd-session[112256]: Connection reset by 176.120.22.52 port 26612
Feb 16 13:07:02 compute-0 podman[112444]: 2026-02-16 13:07:02.030302602 +0000 UTC m=+0.060130856 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:07:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:07:03.195 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:07:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:07:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:07:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:07:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:07:05 compute-0 podman[112465]: 2026-02-16 13:07:05.056162173 +0000 UTC m=+0.093864114 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:07:06 compute-0 sshd-session[112491]: Connection closed by authenticating user root 146.190.226.24 port 60514 [preauth]
Feb 16 13:07:10 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:07:10 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:07:22 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:07:32 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 16 13:07:33 compute-0 podman[112509]: 2026-02-16 13:07:33.040128421 +0000 UTC m=+0.064576235 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:07:36 compute-0 podman[114201]: 2026-02-16 13:07:36.026238413 +0000 UTC m=+0.069429312 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20260127)
Feb 16 13:08:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:08:03.195 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:08:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:08:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:08:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:08:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:08:04 compute-0 podman[129442]: 2026-02-16 13:08:04.062294693 +0000 UTC m=+0.092344698 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:08:07 compute-0 podman[129467]: 2026-02-16 13:08:07.072599131 +0000 UTC m=+0.099996813 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:08:07 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:08:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:08:08 compute-0 groupadd[129501]: group added to /etc/group: name=dnsmasq, GID=993
Feb 16 13:08:08 compute-0 groupadd[129501]: group added to /etc/gshadow: name=dnsmasq
Feb 16 13:08:08 compute-0 groupadd[129501]: new group: name=dnsmasq, GID=993
Feb 16 13:08:08 compute-0 useradd[129508]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 16 13:08:08 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 13:08:08 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 16 13:08:08 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Feb 16 13:08:09 compute-0 groupadd[129521]: group added to /etc/group: name=clevis, GID=992
Feb 16 13:08:09 compute-0 groupadd[129521]: group added to /etc/gshadow: name=clevis
Feb 16 13:08:09 compute-0 groupadd[129521]: new group: name=clevis, GID=992
Feb 16 13:08:09 compute-0 useradd[129528]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 16 13:08:09 compute-0 usermod[129538]: add 'clevis' to group 'tss'
Feb 16 13:08:09 compute-0 usermod[129538]: add 'clevis' to shadow group 'tss'
Feb 16 13:08:10 compute-0 sshd-session[129556]: Connection closed by authenticating user root 146.190.226.24 port 57498 [preauth]
Feb 16 13:08:11 compute-0 polkitd[44299]: Reloading rules
Feb 16 13:08:11 compute-0 polkitd[44299]: Collecting garbage unconditionally...
Feb 16 13:08:11 compute-0 polkitd[44299]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 13:08:11 compute-0 polkitd[44299]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 13:08:11 compute-0 polkitd[44299]: Finished loading, compiling and executing 3 rules
Feb 16 13:08:11 compute-0 polkitd[44299]: Reloading rules
Feb 16 13:08:11 compute-0 polkitd[44299]: Collecting garbage unconditionally...
Feb 16 13:08:11 compute-0 polkitd[44299]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 13:08:11 compute-0 polkitd[44299]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 13:08:11 compute-0 polkitd[44299]: Finished loading, compiling and executing 3 rules
Feb 16 13:08:12 compute-0 groupadd[129730]: group added to /etc/group: name=ceph, GID=167
Feb 16 13:08:12 compute-0 groupadd[129730]: group added to /etc/gshadow: name=ceph
Feb 16 13:08:12 compute-0 groupadd[129730]: new group: name=ceph, GID=167
Feb 16 13:08:12 compute-0 useradd[129736]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 16 13:08:15 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 16 13:08:15 compute-0 sshd[1018]: Received signal 15; terminating.
Feb 16 13:08:15 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 16 13:08:15 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 16 13:08:15 compute-0 systemd[1]: sshd.service: Consumed 1.503s CPU time, read 32.0K from disk, written 0B to disk.
Feb 16 13:08:15 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 16 13:08:15 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 16 13:08:15 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:15 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:15 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:15 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 16 13:08:15 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 16 13:08:15 compute-0 sshd[130256]: Server listening on 0.0.0.0 port 22.
Feb 16 13:08:15 compute-0 sshd[130256]: Server listening on :: port 22.
Feb 16 13:08:15 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 16 13:08:16 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:08:16 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:08:16 compute-0 systemd[1]: Reloading.
Feb 16 13:08:16 compute-0 systemd-sysv-generator[130520]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:16 compute-0 systemd-rc-local-generator[130515]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:16 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:08:19 compute-0 sudo[112251]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:19 compute-0 sudo[135268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqtqdwsrhgmswbopzvhnwnknuefdevyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247299.2565584-652-20734101195860/AnsiballZ_systemd.py'
Feb 16 13:08:19 compute-0 sudo[135268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:20 compute-0 python3.9[135290]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:20 compute-0 systemd[1]: Reloading.
Feb 16 13:08:20 compute-0 systemd-rc-local-generator[135858]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:20 compute-0 systemd-sysv-generator[135861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:20 compute-0 sudo[135268]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:20 compute-0 sudo[136883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxagzcstthzciruixsafrsnnktwpuzlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247300.606662-652-225760709912843/AnsiballZ_systemd.py'
Feb 16 13:08:20 compute-0 sudo[136883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:21 compute-0 python3.9[136917]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:21 compute-0 systemd[1]: Reloading.
Feb 16 13:08:21 compute-0 systemd-rc-local-generator[137528]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:21 compute-0 systemd-sysv-generator[137536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:21 compute-0 sudo[136883]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:21 compute-0 sudo[138565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcqalodfcztsezjqijfnkqmygmkggff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247301.642611-652-59372835620035/AnsiballZ_systemd.py'
Feb 16 13:08:21 compute-0 sudo[138565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:22 compute-0 python3.9[138609]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:22 compute-0 systemd[1]: Reloading.
Feb 16 13:08:22 compute-0 systemd-rc-local-generator[139167]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:22 compute-0 systemd-sysv-generator[139172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:22 compute-0 sudo[138565]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:08:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:08:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 7.586s CPU time.
Feb 16 13:08:22 compute-0 systemd[1]: run-r4b93cb45d02346f68cd5566e8e1018a5.service: Deactivated successfully.
Feb 16 13:08:22 compute-0 sudo[139643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxhaoddzhptpdjeetroxdgyuuazkilbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247302.629363-652-137579517866276/AnsiballZ_systemd.py'
Feb 16 13:08:22 compute-0 sudo[139643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:23 compute-0 python3.9[139645]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:23 compute-0 systemd[1]: Reloading.
Feb 16 13:08:23 compute-0 systemd-rc-local-generator[139670]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:23 compute-0 systemd-sysv-generator[139675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:23 compute-0 sudo[139643]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:24 compute-0 sudo[139840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndenzsgmtcfsfhvgrdaxveyqeyaqomxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247303.9955857-710-36193719173054/AnsiballZ_systemd.py'
Feb 16 13:08:24 compute-0 sudo[139840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:24 compute-0 python3.9[139842]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:24 compute-0 systemd[1]: Reloading.
Feb 16 13:08:24 compute-0 systemd-sysv-generator[139877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:24 compute-0 systemd-rc-local-generator[139873]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:24 compute-0 sudo[139840]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:25 compute-0 sudo[140036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxhmzmkbwfcoptvqksufooudhkkmehpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247305.0136952-710-70708924260505/AnsiballZ_systemd.py'
Feb 16 13:08:25 compute-0 sudo[140036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:25 compute-0 python3.9[140038]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:25 compute-0 systemd[1]: Reloading.
Feb 16 13:08:25 compute-0 systemd-rc-local-generator[140069]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:25 compute-0 systemd-sysv-generator[140072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:25 compute-0 sudo[140036]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:26 compute-0 sudo[140233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhtdvthoixehtecovejncehkuisaopyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247305.9998062-710-108388467425606/AnsiballZ_systemd.py'
Feb 16 13:08:26 compute-0 sudo[140233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:26 compute-0 python3.9[140235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:26 compute-0 systemd[1]: Reloading.
Feb 16 13:08:26 compute-0 systemd-rc-local-generator[140267]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:26 compute-0 systemd-sysv-generator[140270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:27 compute-0 sudo[140233]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:27 compute-0 sudo[140430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibpizkbckbyegantvgcdvqjfbcrtcfaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247307.279277-710-79633018064519/AnsiballZ_systemd.py'
Feb 16 13:08:27 compute-0 sudo[140430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:27 compute-0 python3.9[140432]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:27 compute-0 sudo[140430]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:28 compute-0 sudo[140585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajcygeufemmnbwptbxhadplfjilaene ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247308.0017443-710-278911910413573/AnsiballZ_systemd.py'
Feb 16 13:08:28 compute-0 sudo[140585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:28 compute-0 python3.9[140587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:28 compute-0 systemd[1]: Reloading.
Feb 16 13:08:28 compute-0 systemd-sysv-generator[140626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:28 compute-0 systemd-rc-local-generator[140622]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:28 compute-0 sudo[140585]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:29 compute-0 sudo[140784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkwizygzoikvmgsuzsvfaeyhlmjxuwsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247309.1560128-782-25193236839350/AnsiballZ_systemd.py'
Feb 16 13:08:29 compute-0 sudo[140784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:29 compute-0 python3.9[140786]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:29 compute-0 systemd[1]: Reloading.
Feb 16 13:08:29 compute-0 systemd-rc-local-generator[140818]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:29 compute-0 systemd-sysv-generator[140822]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:30 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 16 13:08:30 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 16 13:08:30 compute-0 sudo[140784]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:30 compute-0 sudo[140984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphcanulhiwrjrmlqhvuizclvkqstoqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247310.5299168-798-137363361287368/AnsiballZ_systemd.py'
Feb 16 13:08:30 compute-0 sudo[140984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:31 compute-0 python3.9[140986]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:31 compute-0 sudo[140984]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:31 compute-0 sudo[141139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetpmvakeysjutewhgpnmcebalwxyhoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247311.305792-798-49032042726315/AnsiballZ_systemd.py'
Feb 16 13:08:31 compute-0 sudo[141139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:31 compute-0 python3.9[141141]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:31 compute-0 sudo[141139]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:32 compute-0 sudo[141294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhsnawolcqwaynaxtoxiayotliqljikk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247312.0872133-798-228234192158636/AnsiballZ_systemd.py'
Feb 16 13:08:32 compute-0 sudo[141294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:32 compute-0 python3.9[141296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:32 compute-0 sudo[141294]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:33 compute-0 sudo[141449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsximcdyrctepsanoxqxportiyitrhtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247312.8243427-798-188210116742100/AnsiballZ_systemd.py'
Feb 16 13:08:33 compute-0 sudo[141449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:33 compute-0 python3.9[141451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:33 compute-0 sudo[141449]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:33 compute-0 sudo[141604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysvgyedbbxbpkjkvjzgmrgxwmfmjbzuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247313.664032-798-189658349473123/AnsiballZ_systemd.py'
Feb 16 13:08:33 compute-0 sudo[141604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:34 compute-0 python3.9[141606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:34 compute-0 sudo[141604]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:34 compute-0 podman[141608]: 2026-02-16 13:08:34.347234356 +0000 UTC m=+0.089464222 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 13:08:34 compute-0 sudo[141778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otavbybbkadnwuydlcynbgmyflgzvenf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247314.698007-798-211357034786786/AnsiballZ_systemd.py'
Feb 16 13:08:34 compute-0 sudo[141778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:35 compute-0 python3.9[141780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:35 compute-0 sudo[141778]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:35 compute-0 sudo[141933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxigdremkfmfiyxzftoujoeyvwfakyvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247315.5079858-798-53819871015714/AnsiballZ_systemd.py'
Feb 16 13:08:35 compute-0 sudo[141933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:36 compute-0 python3.9[141935]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:36 compute-0 sudo[141933]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:36 compute-0 sudo[142088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwaykmsbqvuzfvxutwiumiavxutqvmtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247316.3315947-798-29235634457417/AnsiballZ_systemd.py'
Feb 16 13:08:36 compute-0 sudo[142088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:36 compute-0 python3.9[142090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:37 compute-0 sudo[142088]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:37 compute-0 sudo[142254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpappkronxigvhtlfgumtczuuebjhssk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247317.1857562-798-9606047141081/AnsiballZ_systemd.py'
Feb 16 13:08:37 compute-0 sudo[142254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:37 compute-0 podman[142217]: 2026-02-16 13:08:37.590629046 +0000 UTC m=+0.131618698 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:08:37 compute-0 python3.9[142262]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:37 compute-0 sudo[142254]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:38 compute-0 sudo[142424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbogocwsiouklgsxgjbxcyslxgynemwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247318.0431342-798-172662690290710/AnsiballZ_systemd.py'
Feb 16 13:08:38 compute-0 sudo[142424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:38 compute-0 python3.9[142426]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:39 compute-0 sudo[142424]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:40 compute-0 sudo[142579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqwofqsvngahuvuwablzgzvimxnhqpzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247319.8854933-798-10586939354209/AnsiballZ_systemd.py'
Feb 16 13:08:40 compute-0 sudo[142579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:40 compute-0 python3.9[142581]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:40 compute-0 sudo[142579]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:40 compute-0 sudo[142734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aapdgbpgavumbwzrxbyjjxfqocwanqus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247320.664939-798-124150061062155/AnsiballZ_systemd.py'
Feb 16 13:08:40 compute-0 sudo[142734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:41 compute-0 python3.9[142736]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:41 compute-0 sudo[142734]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:41 compute-0 sudo[142889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqhznrevuqehiqrkklqytfihifcmflcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247321.4657724-798-266636951917248/AnsiballZ_systemd.py'
Feb 16 13:08:41 compute-0 sudo[142889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:42 compute-0 python3.9[142891]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:42 compute-0 sudo[142889]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:42 compute-0 sudo[143044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lammwefjcmwyvlmkkurhavktemqhdyip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247322.2753258-798-62319581666034/AnsiballZ_systemd.py'
Feb 16 13:08:42 compute-0 sudo[143044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:42 compute-0 python3.9[143046]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:42 compute-0 sudo[143044]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:43 compute-0 sudo[143199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqxkskmfvgnhzgpvagjqhomtzlxgxbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247323.5103915-1002-190804839549624/AnsiballZ_file.py'
Feb 16 13:08:43 compute-0 sudo[143199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:43 compute-0 python3.9[143201]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:44 compute-0 sudo[143199]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:44 compute-0 sudo[143351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylypqsdiiltvaybnzoqahxoxjgrdfnkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247324.1274927-1002-227801237182176/AnsiballZ_file.py'
Feb 16 13:08:44 compute-0 sudo[143351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:44 compute-0 python3.9[143353]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:44 compute-0 sudo[143351]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:44 compute-0 sudo[143503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofgvzrmbdwqkefobzkrajwrhzivwjuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247324.7243576-1002-37842896591176/AnsiballZ_file.py'
Feb 16 13:08:44 compute-0 sudo[143503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:45 compute-0 python3.9[143505]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:45 compute-0 sudo[143503]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:45 compute-0 sudo[143655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqqvbhqoedjwgwmzwfzftynjlujezehn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247325.2917726-1002-29651082030713/AnsiballZ_file.py'
Feb 16 13:08:45 compute-0 sudo[143655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:45 compute-0 python3.9[143657]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:45 compute-0 sudo[143655]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:46 compute-0 sudo[143807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olbxllnzxpyobdtpcpatrsvjgymqzkab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247325.9017453-1002-20159662125757/AnsiballZ_file.py'
Feb 16 13:08:46 compute-0 sudo[143807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:46 compute-0 python3.9[143809]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:46 compute-0 sudo[143807]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:46 compute-0 sudo[143959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwjigxqskquoqgbrcvpoivuxgrkngcoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247326.5353703-1002-38654593302902/AnsiballZ_file.py'
Feb 16 13:08:46 compute-0 sudo[143959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:46 compute-0 python3.9[143961]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:47 compute-0 sudo[143959]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:47 compute-0 python3.9[144111]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:08:48 compute-0 sudo[144261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szqoaehbdpymburhzwupxwmzaptfoqon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247328.0229204-1104-169498060649753/AnsiballZ_stat.py'
Feb 16 13:08:48 compute-0 sudo[144261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:48 compute-0 python3.9[144263]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:48 compute-0 sudo[144261]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:49 compute-0 sudo[144386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olozazuvhinpjbsmuealgbboasnnkuic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247328.0229204-1104-169498060649753/AnsiballZ_copy.py'
Feb 16 13:08:49 compute-0 sudo[144386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:49 compute-0 python3.9[144388]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247328.0229204-1104-169498060649753/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:49 compute-0 sudo[144386]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:49 compute-0 sudo[144538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcmflmgpfufgfdgvdweqmcyfcnadezjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247329.4973187-1104-188067567337211/AnsiballZ_stat.py'
Feb 16 13:08:49 compute-0 sudo[144538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:49 compute-0 python3.9[144540]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:49 compute-0 sudo[144538]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:50 compute-0 sudo[144663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvoqksjavbbxitzqrbjtamucmxcvmdog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247329.4973187-1104-188067567337211/AnsiballZ_copy.py'
Feb 16 13:08:50 compute-0 sudo[144663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:50 compute-0 python3.9[144665]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247329.4973187-1104-188067567337211/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:50 compute-0 sudo[144663]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:50 compute-0 sudo[144815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piisgsucpdjddrabdpcdpwdappxolwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247330.688098-1104-30650354900647/AnsiballZ_stat.py'
Feb 16 13:08:50 compute-0 sudo[144815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:51 compute-0 python3.9[144817]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:51 compute-0 sudo[144815]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:51 compute-0 sudo[144940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqxwfcdnxjiphowfijygtqzdlkjfccii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247330.688098-1104-30650354900647/AnsiballZ_copy.py'
Feb 16 13:08:51 compute-0 sudo[144940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:51 compute-0 python3.9[144942]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247330.688098-1104-30650354900647/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:51 compute-0 sudo[144940]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:52 compute-0 sudo[145092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evihrkdqzjeqjelhqjnmygmgbkggpuok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247331.8840425-1104-235422668262787/AnsiballZ_stat.py'
Feb 16 13:08:52 compute-0 sudo[145092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:52 compute-0 python3.9[145094]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:52 compute-0 sudo[145092]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:52 compute-0 sudo[145217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvuisqrrcclzejedykqhbfsnvwutpdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247331.8840425-1104-235422668262787/AnsiballZ_copy.py'
Feb 16 13:08:52 compute-0 sudo[145217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:52 compute-0 python3.9[145219]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247331.8840425-1104-235422668262787/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:52 compute-0 sudo[145217]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:53 compute-0 sudo[145369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kujpmorisdrieeygnentkqvsqoomphvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247333.0736136-1104-195655846899673/AnsiballZ_stat.py'
Feb 16 13:08:53 compute-0 sudo[145369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:53 compute-0 python3.9[145371]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:53 compute-0 sudo[145369]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:54 compute-0 sudo[145494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfufkqtgesfkknaprvncidepqpinxana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247333.0736136-1104-195655846899673/AnsiballZ_copy.py'
Feb 16 13:08:54 compute-0 sudo[145494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:54 compute-0 python3.9[145496]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247333.0736136-1104-195655846899673/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:54 compute-0 sudo[145494]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:54 compute-0 sudo[145646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hubyumlhzhmletbyypvkhzdrdqericbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247334.536964-1104-102164961893116/AnsiballZ_stat.py'
Feb 16 13:08:54 compute-0 sudo[145646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:54 compute-0 python3.9[145648]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:55 compute-0 sudo[145646]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:55 compute-0 sudo[145771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-widsecuqprfpxziuaepdeipuylmxmvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247334.536964-1104-102164961893116/AnsiballZ_copy.py'
Feb 16 13:08:55 compute-0 sudo[145771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:55 compute-0 python3.9[145773]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247334.536964-1104-102164961893116/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:55 compute-0 sudo[145771]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:56 compute-0 sudo[145923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eceyslljfwigftlozfclyedjefqnrwfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247335.7790587-1104-153276745936729/AnsiballZ_stat.py'
Feb 16 13:08:56 compute-0 sudo[145923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:56 compute-0 python3.9[145925]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:56 compute-0 sudo[145923]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:56 compute-0 sudo[146046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjihotjuzfevkehjeelhzdtidglzletb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247335.7790587-1104-153276745936729/AnsiballZ_copy.py'
Feb 16 13:08:56 compute-0 sudo[146046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:56 compute-0 python3.9[146048]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247335.7790587-1104-153276745936729/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:56 compute-0 sudo[146046]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:57 compute-0 sudo[146198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxryspiuowhnpxztlowaotsyownvtgvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247336.9632802-1104-168770037283248/AnsiballZ_stat.py'
Feb 16 13:08:57 compute-0 sudo[146198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:57 compute-0 python3.9[146200]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:57 compute-0 sudo[146198]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:57 compute-0 sudo[146323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvvvdvhuaajgzhftlamdlvwulpkmlqlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247336.9632802-1104-168770037283248/AnsiballZ_copy.py'
Feb 16 13:08:57 compute-0 sudo[146323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:58 compute-0 python3.9[146325]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247336.9632802-1104-168770037283248/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:58 compute-0 sudo[146323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:58 compute-0 sudo[146475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhahwoonszlohkhaqwllqvzdmjldqedu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247338.3397226-1330-254408217283121/AnsiballZ_command.py'
Feb 16 13:08:58 compute-0 sudo[146475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:58 compute-0 python3.9[146477]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 16 13:08:58 compute-0 sudo[146475]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:59 compute-0 sudo[146628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkunkdgsmmkzarvwlvzcyehnqedqzcyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247339.1004052-1348-166131753030448/AnsiballZ_file.py'
Feb 16 13:08:59 compute-0 sudo[146628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:59 compute-0 python3.9[146630]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:59 compute-0 sudo[146628]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:00 compute-0 sudo[146780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rauxyubgmysikpuzcbzpysichjpgxlij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247339.8173153-1348-125515326753258/AnsiballZ_file.py'
Feb 16 13:09:00 compute-0 sudo[146780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:00 compute-0 python3.9[146782]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:00 compute-0 sudo[146780]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:00 compute-0 sudo[146932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxufgnjatyrvdibpuowzbecqltswbqso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247340.4563127-1348-5322016967752/AnsiballZ_file.py'
Feb 16 13:09:00 compute-0 sudo[146932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:00 compute-0 python3.9[146934]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:00 compute-0 sudo[146932]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:01 compute-0 sudo[147084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbyygdbxhdtdfjwdxjcrhnntccqgidbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247341.1307776-1348-153007363525937/AnsiballZ_file.py'
Feb 16 13:09:01 compute-0 sudo[147084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:01 compute-0 python3.9[147086]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:01 compute-0 sudo[147084]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:02 compute-0 sudo[147236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbpqlmadjcalfazdljutynufusdppjbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247341.8280137-1348-43177300575794/AnsiballZ_file.py'
Feb 16 13:09:02 compute-0 sudo[147236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:02 compute-0 python3.9[147238]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:02 compute-0 sudo[147236]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:02 compute-0 sudo[147388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aigtmsfwgrqhyzazyftktgaylgnmhtqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247342.405644-1348-132658213925669/AnsiballZ_file.py'
Feb 16 13:09:02 compute-0 sudo[147388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:02 compute-0 python3.9[147390]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:02 compute-0 sudo[147388]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:09:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:09:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:09:03.198 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:09:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:09:03.198 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:09:03 compute-0 sudo[147540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsqxarvjzhgejyqhqraklyezkyytervw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247342.9818022-1348-171507027984465/AnsiballZ_file.py'
Feb 16 13:09:03 compute-0 sudo[147540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:03 compute-0 python3.9[147542]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:03 compute-0 sudo[147540]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:03 compute-0 sudo[147692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwivcicjhytnbdgonvtpbrbwjeanqnzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247343.6184623-1348-168473389530801/AnsiballZ_file.py'
Feb 16 13:09:03 compute-0 sudo[147692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:04 compute-0 python3.9[147694]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:04 compute-0 sudo[147692]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:04 compute-0 sudo[147854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obbjwfjlrjqmkwfipckvzcrjjtwyejip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247344.208469-1348-121856045453205/AnsiballZ_file.py'
Feb 16 13:09:04 compute-0 sudo[147854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:04 compute-0 podman[147818]: 2026-02-16 13:09:04.543363465 +0000 UTC m=+0.071451541 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:09:04 compute-0 python3.9[147861]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:04 compute-0 sudo[147854]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:05 compute-0 sudo[148016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zscnwskmiiyzhahkiglzznxjyzqhyhfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247344.8615866-1348-208594340016198/AnsiballZ_file.py'
Feb 16 13:09:05 compute-0 sudo[148016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:05 compute-0 python3.9[148018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:05 compute-0 sudo[148016]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:05 compute-0 sudo[148168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmmilmrsaovnfaiyqzmgsvvamhhxwroy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247345.4364846-1348-167617623636138/AnsiballZ_file.py'
Feb 16 13:09:05 compute-0 sudo[148168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:05 compute-0 python3.9[148170]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:05 compute-0 sudo[148168]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:06 compute-0 sudo[148320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odgajmtvbdhfvztvzcuqkggcygitqnej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247346.0638213-1348-109368072779889/AnsiballZ_file.py'
Feb 16 13:09:06 compute-0 sudo[148320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:06 compute-0 python3.9[148322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:06 compute-0 sudo[148320]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:06 compute-0 sudo[148472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arpvtzupqszmnuvphbwwdmianauhesge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247346.6959379-1348-192751437224618/AnsiballZ_file.py'
Feb 16 13:09:06 compute-0 sudo[148472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:07 compute-0 python3.9[148474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:07 compute-0 sudo[148472]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:07 compute-0 sudo[148624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjwmkwrwtjowaqcxxrxqosqmovewwdry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247347.3582616-1348-43061721633898/AnsiballZ_file.py'
Feb 16 13:09:07 compute-0 sudo[148624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:07 compute-0 podman[148626]: 2026-02-16 13:09:07.770489296 +0000 UTC m=+0.123118425 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:09:07 compute-0 python3.9[148627]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:07 compute-0 sudo[148624]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:08 compute-0 sudo[148802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwtfjkdweotsnubznlmmleisgnoujmuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247348.0777583-1546-74771712555319/AnsiballZ_stat.py'
Feb 16 13:09:08 compute-0 sudo[148802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:08 compute-0 python3.9[148804]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:08 compute-0 sudo[148802]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:08 compute-0 sudo[148925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhdzkrejzevrijajwibsyodjyrvicmxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247348.0777583-1546-74771712555319/AnsiballZ_copy.py'
Feb 16 13:09:08 compute-0 sudo[148925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:09 compute-0 python3.9[148927]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247348.0777583-1546-74771712555319/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:09 compute-0 sudo[148925]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:09 compute-0 sudo[149077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icfqmvycnmmwadkfaotesssbwmmxrplj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247349.3602562-1546-159118724705919/AnsiballZ_stat.py'
Feb 16 13:09:09 compute-0 sudo[149077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:09 compute-0 python3.9[149079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:09 compute-0 sudo[149077]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:10 compute-0 sudo[149200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arzuvesoihxavopctlmaripiqikswcnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247349.3602562-1546-159118724705919/AnsiballZ_copy.py'
Feb 16 13:09:10 compute-0 sudo[149200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:10 compute-0 python3.9[149202]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247349.3602562-1546-159118724705919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:10 compute-0 sudo[149200]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:10 compute-0 sudo[149352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygrxkahynjmexgbewbfwljgslkmxqgfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247350.4661708-1546-173869717732850/AnsiballZ_stat.py'
Feb 16 13:09:10 compute-0 sudo[149352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:10 compute-0 python3.9[149354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:10 compute-0 sudo[149352]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:11 compute-0 sudo[149475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffomcvztcpcxcotkoqcptmpyklkpfqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247350.4661708-1546-173869717732850/AnsiballZ_copy.py'
Feb 16 13:09:11 compute-0 sudo[149475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:11 compute-0 python3.9[149477]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247350.4661708-1546-173869717732850/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:11 compute-0 sudo[149475]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:11 compute-0 sudo[149627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwbjzptvymjeblyvrqtkgfbihbmeubz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247351.6407332-1546-86652440970602/AnsiballZ_stat.py'
Feb 16 13:09:11 compute-0 sudo[149627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:12 compute-0 python3.9[149629]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:12 compute-0 sudo[149627]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:12 compute-0 sudo[149750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozyjfcvdlbajhapmvonczrrgbroucymj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247351.6407332-1546-86652440970602/AnsiballZ_copy.py'
Feb 16 13:09:12 compute-0 sudo[149750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:12 compute-0 python3.9[149752]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247351.6407332-1546-86652440970602/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:12 compute-0 sudo[149750]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:13 compute-0 sudo[149902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phptbbmmqoqrewjkyzmanqdtfvkkmldp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247352.8171678-1546-240813302216441/AnsiballZ_stat.py'
Feb 16 13:09:13 compute-0 sudo[149902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:13 compute-0 python3.9[149904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:13 compute-0 sudo[149902]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:13 compute-0 sudo[150025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfwdxzbktddfpnirxozjwsiddxdbiwxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247352.8171678-1546-240813302216441/AnsiballZ_copy.py'
Feb 16 13:09:13 compute-0 sudo[150025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:13 compute-0 python3.9[150027]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247352.8171678-1546-240813302216441/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:13 compute-0 sudo[150025]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:14 compute-0 sudo[150177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abxorulgcvbtilxrukpfhkkkzzsglbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247353.9252841-1546-187768062904213/AnsiballZ_stat.py'
Feb 16 13:09:14 compute-0 sudo[150177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:14 compute-0 python3.9[150179]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:14 compute-0 sudo[150177]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:14 compute-0 sudo[150300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiaweoqulgnoymudbpbsvwjcmvchdopa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247353.9252841-1546-187768062904213/AnsiballZ_copy.py'
Feb 16 13:09:14 compute-0 sudo[150300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:14 compute-0 python3.9[150302]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247353.9252841-1546-187768062904213/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:14 compute-0 sudo[150300]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:15 compute-0 sudo[150452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aadlmhfhswfehmgkgoxfcaeqwcbuhroe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247355.1038234-1546-97372191868883/AnsiballZ_stat.py'
Feb 16 13:09:15 compute-0 sudo[150452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:15 compute-0 python3.9[150454]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:15 compute-0 sudo[150452]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:15 compute-0 sudo[150575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iutevxgmejutwkukryyxyxyfvopaqzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247355.1038234-1546-97372191868883/AnsiballZ_copy.py'
Feb 16 13:09:15 compute-0 sudo[150575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:16 compute-0 python3.9[150577]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247355.1038234-1546-97372191868883/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:16 compute-0 sudo[150575]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:16 compute-0 sudo[150727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehncknshufxprcurbchcqalemjvokubg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247356.5506082-1546-138340573414485/AnsiballZ_stat.py'
Feb 16 13:09:16 compute-0 sudo[150727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:17 compute-0 python3.9[150729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:17 compute-0 sudo[150727]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:17 compute-0 sudo[150850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byqrfqfuddhklroluckfuxbiurxtuswr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247356.5506082-1546-138340573414485/AnsiballZ_copy.py'
Feb 16 13:09:17 compute-0 sudo[150850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:17 compute-0 python3.9[150852]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247356.5506082-1546-138340573414485/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:17 compute-0 sudo[150850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:18 compute-0 sudo[151003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odrbvginqbtvutchgmzdeligvwqcnldt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247357.790583-1546-213788846891474/AnsiballZ_stat.py'
Feb 16 13:09:18 compute-0 sudo[151003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:18 compute-0 python3.9[151005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:18 compute-0 sudo[151003]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:18 compute-0 sudo[151127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztoinceuazudhrryczskpitmfdzmpbgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247357.790583-1546-213788846891474/AnsiballZ_copy.py'
Feb 16 13:09:18 compute-0 sudo[151127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:18 compute-0 sshd-session[150853]: Connection closed by authenticating user root 146.190.226.24 port 42130 [preauth]
Feb 16 13:09:18 compute-0 python3.9[151129]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247357.790583-1546-213788846891474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:18 compute-0 sudo[151127]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:19 compute-0 sudo[151279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkiouorriqbkwwojyeutaruncesmfutv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247358.9561899-1546-571432366271/AnsiballZ_stat.py'
Feb 16 13:09:19 compute-0 sudo[151279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:19 compute-0 python3.9[151281]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:19 compute-0 sudo[151279]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:19 compute-0 sudo[151402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brsufvpvdumohtfrooajonmdpevnvvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247358.9561899-1546-571432366271/AnsiballZ_copy.py'
Feb 16 13:09:19 compute-0 sudo[151402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:20 compute-0 python3.9[151404]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247358.9561899-1546-571432366271/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:20 compute-0 sudo[151402]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:20 compute-0 sudo[151554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdlzsvlmtxvpjwsqwraeearezclfwhrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247360.1853774-1546-125995448577765/AnsiballZ_stat.py'
Feb 16 13:09:20 compute-0 sudo[151554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:20 compute-0 python3.9[151556]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:20 compute-0 sudo[151554]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:21 compute-0 sudo[151677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aauridsneilzklgxydcnjadxtrtoldgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247360.1853774-1546-125995448577765/AnsiballZ_copy.py'
Feb 16 13:09:21 compute-0 sudo[151677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:21 compute-0 python3.9[151679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247360.1853774-1546-125995448577765/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:21 compute-0 sudo[151677]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:21 compute-0 sudo[151829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxwzwzcumsfwplvfshnpxihlmmezzhnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247361.4175642-1546-105284753930412/AnsiballZ_stat.py'
Feb 16 13:09:21 compute-0 sudo[151829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:21 compute-0 python3.9[151831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:21 compute-0 sudo[151829]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:22 compute-0 sudo[151952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tidzsqpwqhxcgkgsupehxfjuqdvygazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247361.4175642-1546-105284753930412/AnsiballZ_copy.py'
Feb 16 13:09:22 compute-0 sudo[151952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:22 compute-0 python3.9[151954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247361.4175642-1546-105284753930412/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:22 compute-0 sudo[151952]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:22 compute-0 sudo[152104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buobmbhfkrytsiziizdkcncgzstejkzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247362.5888133-1546-102258329636861/AnsiballZ_stat.py'
Feb 16 13:09:22 compute-0 sudo[152104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:23 compute-0 python3.9[152106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:23 compute-0 sudo[152104]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:23 compute-0 sudo[152227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xelimoxfieakvxxiaepelsvyoukqwhbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247362.5888133-1546-102258329636861/AnsiballZ_copy.py'
Feb 16 13:09:23 compute-0 sudo[152227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:23 compute-0 python3.9[152229]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247362.5888133-1546-102258329636861/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:23 compute-0 sudo[152227]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:24 compute-0 sudo[152379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqcsghgaujcmkjqcmcojiyhkqbhjxasa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247363.728885-1546-213870197180121/AnsiballZ_stat.py'
Feb 16 13:09:24 compute-0 sudo[152379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:24 compute-0 python3.9[152381]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:24 compute-0 sudo[152379]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:24 compute-0 sudo[152502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcdmitmojjzmyjaokauvdzwztetbvxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247363.728885-1546-213870197180121/AnsiballZ_copy.py'
Feb 16 13:09:24 compute-0 sudo[152502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:24 compute-0 python3.9[152504]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247363.728885-1546-213870197180121/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:24 compute-0 sudo[152502]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:25 compute-0 python3.9[152654]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:26 compute-0 sudo[152807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdamioclmujnyrnryyzizsysdnqjpngr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247365.701908-1958-144981036489982/AnsiballZ_seboolean.py'
Feb 16 13:09:26 compute-0 sudo[152807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:26 compute-0 python3.9[152809]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 16 13:09:27 compute-0 sudo[152807]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:27 compute-0 sudo[152963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwzvijalhyywwssxujujcfggafuemkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247367.551255-1974-21347416700474/AnsiballZ_copy.py'
Feb 16 13:09:27 compute-0 dbus-broker-launch[808]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 16 13:09:27 compute-0 sudo[152963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:28 compute-0 python3.9[152965]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:28 compute-0 sudo[152963]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:28 compute-0 sudo[153115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hivkaitopwyycrevramuiqxnrogqhuwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247368.27289-1974-111304660757359/AnsiballZ_copy.py'
Feb 16 13:09:28 compute-0 sudo[153115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:28 compute-0 python3.9[153117]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:28 compute-0 sudo[153115]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:29 compute-0 sudo[153267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekodjizoxwqasntdciztgobxyoorjwwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247368.8646352-1974-72028007097160/AnsiballZ_copy.py'
Feb 16 13:09:29 compute-0 sudo[153267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:29 compute-0 python3.9[153269]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:29 compute-0 sudo[153267]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:29 compute-0 sudo[153419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsntgxymxqajeplpbfudciqrdwwbyzvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247369.5952168-1974-141025963490920/AnsiballZ_copy.py'
Feb 16 13:09:29 compute-0 sudo[153419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:30 compute-0 python3.9[153421]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:30 compute-0 sudo[153419]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:30 compute-0 sudo[153571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajnibnzbpolsqslcjxzpbhfjdmdqkbta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247370.2043538-1974-24009476449221/AnsiballZ_copy.py'
Feb 16 13:09:30 compute-0 sudo[153571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:30 compute-0 python3.9[153573]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:30 compute-0 sudo[153571]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:31 compute-0 sudo[153723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnimeqxfqkuujdzcbcarsofzlnhlmnfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247370.8787146-2046-148345275663502/AnsiballZ_copy.py'
Feb 16 13:09:31 compute-0 sudo[153723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:31 compute-0 python3.9[153725]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:31 compute-0 sudo[153723]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:31 compute-0 sudo[153875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pikmwsmedlihsnyupnbcjvvznjvengvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247371.5377629-2046-8263326757326/AnsiballZ_copy.py'
Feb 16 13:09:31 compute-0 sudo[153875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:32 compute-0 python3.9[153877]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:32 compute-0 sudo[153875]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:32 compute-0 sudo[154027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuxhqzcfzxpcnwsoezinykzxffkjnenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247372.1748502-2046-118714934660804/AnsiballZ_copy.py'
Feb 16 13:09:32 compute-0 sudo[154027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:32 compute-0 python3.9[154029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:32 compute-0 sudo[154027]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:33 compute-0 sudo[154179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvfllkgiraobyqoehbpkfqytsutvxalu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247372.8193326-2046-274573587044097/AnsiballZ_copy.py'
Feb 16 13:09:33 compute-0 sudo[154179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:33 compute-0 python3.9[154181]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:33 compute-0 sudo[154179]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:33 compute-0 sudo[154331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zughykqcqisyxmxajxfzhbsuhltorhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247373.4665687-2046-70178419194658/AnsiballZ_copy.py'
Feb 16 13:09:33 compute-0 sudo[154331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:33 compute-0 python3.9[154333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:33 compute-0 sudo[154331]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:34 compute-0 sudo[154483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mckacmjzmmyfzxqiuptyxsjsbbzsipxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247374.1559143-2118-66417923586751/AnsiballZ_systemd.py'
Feb 16 13:09:34 compute-0 sudo[154483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:34 compute-0 python3.9[154485]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:34 compute-0 systemd[1]: Reloading.
Feb 16 13:09:34 compute-0 systemd-rc-local-generator[154531]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:34 compute-0 systemd-sysv-generator[154534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:34 compute-0 podman[154487]: 2026-02-16 13:09:34.884595515 +0000 UTC m=+0.083342506 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Feb 16 13:09:35 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 16 13:09:35 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 16 13:09:35 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 16 13:09:35 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 16 13:09:35 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 16 13:09:35 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 16 13:09:35 compute-0 sudo[154483]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:35 compute-0 sudo[154704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yormcjmqpwwbluuszxexqpfxwejqyeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247375.298778-2118-244304119678800/AnsiballZ_systemd.py'
Feb 16 13:09:35 compute-0 sudo[154704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:35 compute-0 python3.9[154706]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:35 compute-0 systemd[1]: Reloading.
Feb 16 13:09:35 compute-0 systemd-rc-local-generator[154727]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:35 compute-0 systemd-sysv-generator[154733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:36 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 16 13:09:36 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 16 13:09:36 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 16 13:09:36 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 16 13:09:36 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 16 13:09:36 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 16 13:09:36 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 13:09:36 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 16 13:09:36 compute-0 sudo[154704]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:36 compute-0 sudo[154927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwrjwbusnxduqlrckosnuggliwmwnjlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247376.3778138-2118-75418658865996/AnsiballZ_systemd.py'
Feb 16 13:09:36 compute-0 sudo[154927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:36 compute-0 python3.9[154929]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:36 compute-0 systemd[1]: Reloading.
Feb 16 13:09:37 compute-0 systemd-rc-local-generator[154953]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:37 compute-0 systemd-sysv-generator[154956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:37 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 16 13:09:37 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 16 13:09:37 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 16 13:09:37 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 16 13:09:37 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 16 13:09:37 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:09:37 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:09:37 compute-0 sudo[154927]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:37 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 16 13:09:37 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 16 13:09:37 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 16 13:09:37 compute-0 sudo[155154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blijunmhetgwcanigmibnrwklgkybzqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247377.4471564-2118-156256543646599/AnsiballZ_systemd.py'
Feb 16 13:09:37 compute-0 sudo[155154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:38 compute-0 python3.9[155156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:38 compute-0 systemd[1]: Reloading.
Feb 16 13:09:38 compute-0 podman[155159]: 2026-02-16 13:09:38.156709557 +0000 UTC m=+0.189265016 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:09:38 compute-0 systemd-rc-local-generator[155209]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:38 compute-0 systemd-sysv-generator[155212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:38 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 16 13:09:38 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 16 13:09:38 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 16 13:09:38 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 16 13:09:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 16 13:09:38 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 16 13:09:38 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 16 13:09:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 16 13:09:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 16 13:09:38 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 16 13:09:38 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 13:09:38 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 16 13:09:38 compute-0 sudo[155154]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:38 compute-0 setroubleshoot[154974]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a5d2fe2a-11d2-4369-9bcb-f95d9357f9ee
Feb 16 13:09:38 compute-0 setroubleshoot[154974]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 16 13:09:38 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:09:38 compute-0 sudo[155404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzxtfqssmgrpmgluvwthkfwyygeeqwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247378.6129558-2118-222953875146576/AnsiballZ_systemd.py'
Feb 16 13:09:38 compute-0 sudo[155404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:39 compute-0 python3.9[155406]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:39 compute-0 systemd[1]: Reloading.
Feb 16 13:09:39 compute-0 systemd-sysv-generator[155438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:39 compute-0 systemd-rc-local-generator[155433]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:39 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 16 13:09:39 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 16 13:09:39 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 16 13:09:39 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 16 13:09:39 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 16 13:09:39 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 16 13:09:39 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 16 13:09:39 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 16 13:09:39 compute-0 sudo[155404]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:40 compute-0 sudo[155623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hftjgsvmjcudamlfmtptpcaqbamiseyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247379.8934348-2192-92419913481694/AnsiballZ_file.py'
Feb 16 13:09:40 compute-0 sudo[155623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:40 compute-0 python3.9[155625]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:40 compute-0 sudo[155623]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:40 compute-0 sudo[155775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojicuaxutacwluettrjxrdarklfraytq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247380.6072292-2208-64996065180781/AnsiballZ_find.py'
Feb 16 13:09:40 compute-0 sudo[155775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:41 compute-0 python3.9[155777]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:09:41 compute-0 sudo[155775]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:41 compute-0 sudo[155927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kychasvwyuujngbyphspypdcsiqporyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247381.605512-2236-82593064943899/AnsiballZ_stat.py'
Feb 16 13:09:41 compute-0 sudo[155927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:42 compute-0 python3.9[155929]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:42 compute-0 sudo[155927]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:42 compute-0 sudo[156050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xksdkbxdzbbopmvvwtulrligeeevkmud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247381.605512-2236-82593064943899/AnsiballZ_copy.py'
Feb 16 13:09:42 compute-0 sudo[156050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:42 compute-0 python3.9[156052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247381.605512-2236-82593064943899/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:42 compute-0 sudo[156050]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:43 compute-0 sudo[156202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njqcpvqeafhldbbuxnfbiqozklujhbho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247382.986436-2268-51992052256837/AnsiballZ_file.py'
Feb 16 13:09:43 compute-0 sudo[156202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:43 compute-0 python3.9[156204]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:43 compute-0 sudo[156202]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:44 compute-0 sudo[156354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roklfglhffcpuckomcbldlgzigrdgntz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247383.7080588-2284-126969393930269/AnsiballZ_stat.py'
Feb 16 13:09:44 compute-0 sudo[156354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:44 compute-0 python3.9[156356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:44 compute-0 sudo[156354]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:44 compute-0 sudo[156432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npzisbptawiimngdgiuegqakgkodeakg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247383.7080588-2284-126969393930269/AnsiballZ_file.py'
Feb 16 13:09:44 compute-0 sudo[156432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:44 compute-0 python3.9[156434]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:44 compute-0 sudo[156432]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:45 compute-0 sudo[156584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdsonoardsizmuagzvavzeffcllrlegd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247384.9168026-2308-115826223423487/AnsiballZ_stat.py'
Feb 16 13:09:45 compute-0 sudo[156584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:45 compute-0 python3.9[156586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:45 compute-0 sudo[156584]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:45 compute-0 sudo[156662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nchioxhspjyqrxsmdotvmnaitqmyfaau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247384.9168026-2308-115826223423487/AnsiballZ_file.py'
Feb 16 13:09:45 compute-0 sudo[156662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:45 compute-0 python3.9[156664]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5kr4yauf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:45 compute-0 sudo[156662]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:46 compute-0 sudo[156814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtovexxacgzctzasmanshaossjxypoka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247386.0883625-2332-212425972296802/AnsiballZ_stat.py'
Feb 16 13:09:46 compute-0 sudo[156814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:46 compute-0 python3.9[156816]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:46 compute-0 sudo[156814]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:46 compute-0 sudo[156892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifiqlrvudgdrmqprjifoczaaoopfaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247386.0883625-2332-212425972296802/AnsiballZ_file.py'
Feb 16 13:09:46 compute-0 sudo[156892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:47 compute-0 python3.9[156894]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:47 compute-0 sudo[156892]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:47 compute-0 sudo[157044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvekwsmzbfhdbiyzysmkdpfobrubqsma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247387.3027833-2358-101116982096940/AnsiballZ_command.py'
Feb 16 13:09:47 compute-0 sudo[157044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:47 compute-0 python3.9[157046]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:47 compute-0 sudo[157044]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:48 compute-0 sudo[157197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lamidmlcigjwhtvrknhyodwusoifxizr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247387.9958038-2374-49112748243617/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:09:48 compute-0 sudo[157197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:48 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 16 13:09:48 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 16 13:09:48 compute-0 python3[157199]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:09:48 compute-0 sudo[157197]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:49 compute-0 sudo[157349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvbbfhunhbrfsljfuhzsctumdqydupd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247389.1282823-2390-267751738936813/AnsiballZ_stat.py'
Feb 16 13:09:49 compute-0 sudo[157349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:49 compute-0 python3.9[157351]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:49 compute-0 sudo[157349]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:49 compute-0 sudo[157427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snalaiepkhnmahomxhxpqustakdlxkmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247389.1282823-2390-267751738936813/AnsiballZ_file.py'
Feb 16 13:09:49 compute-0 sudo[157427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:50 compute-0 python3.9[157429]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:50 compute-0 sudo[157427]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:50 compute-0 sudo[157579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jakmkxphubudiqxptqdlwxfmltcfmouc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247390.3186007-2414-35705887630850/AnsiballZ_stat.py'
Feb 16 13:09:50 compute-0 sudo[157579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:50 compute-0 python3.9[157581]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:50 compute-0 sudo[157579]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:51 compute-0 sudo[157704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjuxdpfowfvecftkpsbnvjxyloctlal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247390.3186007-2414-35705887630850/AnsiballZ_copy.py'
Feb 16 13:09:51 compute-0 sudo[157704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:51 compute-0 python3.9[157706]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247390.3186007-2414-35705887630850/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:51 compute-0 sudo[157704]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:51 compute-0 sudo[157856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uchkajncqfwtrmpjhfrpfhcbxgjsabsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247391.5503724-2444-239556111091920/AnsiballZ_stat.py'
Feb 16 13:09:51 compute-0 sudo[157856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:52 compute-0 python3.9[157858]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:52 compute-0 sudo[157856]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:52 compute-0 sudo[157934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xozrekeyzxbsnhqqtypydrasyxauqohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247391.5503724-2444-239556111091920/AnsiballZ_file.py'
Feb 16 13:09:52 compute-0 sudo[157934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:52 compute-0 python3.9[157936]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:52 compute-0 sudo[157934]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:53 compute-0 sudo[158086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oclubycaucztfldbupapimurzaczytkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247392.7067962-2468-160705250137163/AnsiballZ_stat.py'
Feb 16 13:09:53 compute-0 sudo[158086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:53 compute-0 python3.9[158088]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:53 compute-0 sudo[158086]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:53 compute-0 sudo[158164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzhwiahejnwpwrwllwlfzcbtznxpfkbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247392.7067962-2468-160705250137163/AnsiballZ_file.py'
Feb 16 13:09:53 compute-0 sudo[158164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:53 compute-0 python3.9[158166]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:53 compute-0 sudo[158164]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:54 compute-0 sudo[158316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aikzfxywiucbmxsitrabblekfpidqsuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247393.82431-2492-183423741579246/AnsiballZ_stat.py'
Feb 16 13:09:54 compute-0 sudo[158316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:54 compute-0 python3.9[158318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:54 compute-0 sudo[158316]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:54 compute-0 sudo[158441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfyworraycramkqkrkvalurdpzsvigcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247393.82431-2492-183423741579246/AnsiballZ_copy.py'
Feb 16 13:09:54 compute-0 sudo[158441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:54 compute-0 python3.9[158443]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247393.82431-2492-183423741579246/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:54 compute-0 sudo[158441]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:55 compute-0 sudo[158593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvfohdbvzeqgjqofzatceeupvvmystmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247395.1750474-2522-122260580064848/AnsiballZ_file.py'
Feb 16 13:09:55 compute-0 sudo[158593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:55 compute-0 python3.9[158595]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:55 compute-0 sudo[158593]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:56 compute-0 sudo[158745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgsxpdqtdmmgvjcwixislwikspxwzqcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247395.964874-2538-199938676143187/AnsiballZ_command.py'
Feb 16 13:09:56 compute-0 sudo[158745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:56 compute-0 python3.9[158747]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:56 compute-0 sudo[158745]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:57 compute-0 sudo[158900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tltmuebsmgnkusbmeivqlantuilrtocu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247396.7180853-2554-135741521136698/AnsiballZ_blockinfile.py'
Feb 16 13:09:57 compute-0 sudo[158900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:57 compute-0 python3.9[158902]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:57 compute-0 sudo[158900]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:58 compute-0 sudo[159052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfrbfhmhmiondoleggenzdpqswjhlbzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247398.0891416-2572-197787487975494/AnsiballZ_command.py'
Feb 16 13:09:58 compute-0 sudo[159052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:58 compute-0 python3.9[159054]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:58 compute-0 sudo[159052]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:59 compute-0 sudo[159205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swkryxsqgmapuhwiquocbthfdzfszyli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247398.7792478-2588-151645862671449/AnsiballZ_stat.py'
Feb 16 13:09:59 compute-0 sudo[159205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:59 compute-0 python3.9[159207]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:09:59 compute-0 sudo[159205]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:59 compute-0 sudo[159359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjqcpzguugdosjhkwahzhphgmjdtjikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247399.4556394-2604-228547904998502/AnsiballZ_command.py'
Feb 16 13:09:59 compute-0 sudo[159359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:59 compute-0 python3.9[159361]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:59 compute-0 sudo[159359]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:00 compute-0 sudo[159514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmrfwzcqnvjjmjtoxnoqkzouyxmrbkud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247400.1860871-2620-207808832328482/AnsiballZ_file.py'
Feb 16 13:10:00 compute-0 sudo[159514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:00 compute-0 python3.9[159516]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:00 compute-0 sudo[159514]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:01 compute-0 sudo[159666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylynaobdsmbpqdltufahjlveyqudrvwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247401.0220416-2636-96830410532588/AnsiballZ_stat.py'
Feb 16 13:10:01 compute-0 sudo[159666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:01 compute-0 sshd-session[159669]: Connection closed by 146.190.22.227 port 35618
Feb 16 13:10:01 compute-0 python3.9[159668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:01 compute-0 sudo[159666]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:02 compute-0 sudo[159790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gipkblbudubkjlhybycaeyvbnrsloakm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247401.0220416-2636-96830410532588/AnsiballZ_copy.py'
Feb 16 13:10:02 compute-0 sudo[159790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:02 compute-0 python3.9[159792]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247401.0220416-2636-96830410532588/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:02 compute-0 sudo[159790]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:02 compute-0 sudo[159942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbyoeriqhlfcbebexzgqzoriacjnqgyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247402.5046196-2666-55262663078323/AnsiballZ_stat.py'
Feb 16 13:10:02 compute-0 sudo[159942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:02 compute-0 python3.9[159944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:02 compute-0 sudo[159942]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:10:03.196 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:10:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:10:03.198 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:10:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:10:03.198 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:10:03 compute-0 sudo[160065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akcyxvejniepbzljndcarcfhwwpvqmfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247402.5046196-2666-55262663078323/AnsiballZ_copy.py'
Feb 16 13:10:03 compute-0 sudo[160065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:03 compute-0 python3.9[160067]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247402.5046196-2666-55262663078323/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:03 compute-0 sudo[160065]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:04 compute-0 sudo[160217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twdguqizzzumsmndltbnbgvyopdnagxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247403.8251727-2696-55445298292004/AnsiballZ_stat.py'
Feb 16 13:10:04 compute-0 sudo[160217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:04 compute-0 python3.9[160219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:04 compute-0 sudo[160217]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:04 compute-0 sudo[160340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfggplvbvbzjhwvxtuiqolpdsejtkxeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247403.8251727-2696-55445298292004/AnsiballZ_copy.py'
Feb 16 13:10:04 compute-0 sudo[160340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:04 compute-0 python3.9[160342]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247403.8251727-2696-55445298292004/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:04 compute-0 sudo[160340]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:05 compute-0 sudo[160505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxrcxzsrpzwonfdciqcejjqygqfahkbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247405.1575682-2726-136979213563381/AnsiballZ_systemd.py'
Feb 16 13:10:05 compute-0 sudo[160505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:05 compute-0 podman[160466]: 2026-02-16 13:10:05.745801701 +0000 UTC m=+0.074252481 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:10:06 compute-0 python3.9[160511]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:06 compute-0 systemd[1]: Reloading.
Feb 16 13:10:06 compute-0 systemd-sysv-generator[160541]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:06 compute-0 systemd-rc-local-generator[160532]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:06 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 16 13:10:06 compute-0 sudo[160505]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:06 compute-0 sudo[160709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjshtmbtavcryjjgueuzdukmlfllltnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247406.5454295-2742-64518753853993/AnsiballZ_systemd.py'
Feb 16 13:10:06 compute-0 sudo[160709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:07 compute-0 python3.9[160711]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 13:10:07 compute-0 systemd[1]: Reloading.
Feb 16 13:10:07 compute-0 systemd-sysv-generator[160740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:07 compute-0 systemd-rc-local-generator[160737]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:07 compute-0 systemd[1]: Reloading.
Feb 16 13:10:07 compute-0 systemd-rc-local-generator[160774]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:07 compute-0 systemd-sysv-generator[160781]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:07 compute-0 sudo[160709]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:08 compute-0 sshd-session[105942]: Connection closed by 192.168.122.30 port 54578
Feb 16 13:10:08 compute-0 sshd-session[105939]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:10:08 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 16 13:10:08 compute-0 systemd[1]: session-23.scope: Consumed 3min 2.472s CPU time.
Feb 16 13:10:08 compute-0 systemd-logind[818]: Session 23 logged out. Waiting for processes to exit.
Feb 16 13:10:08 compute-0 systemd-logind[818]: Removed session 23.
Feb 16 13:10:09 compute-0 podman[160823]: 2026-02-16 13:10:09.035770604 +0000 UTC m=+0.077317064 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:10:14 compute-0 sshd-session[160850]: Accepted publickey for zuul from 192.168.122.30 port 48644 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:10:14 compute-0 systemd-logind[818]: New session 24 of user zuul.
Feb 16 13:10:14 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 16 13:10:14 compute-0 sshd-session[160850]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:10:15 compute-0 python3.9[161003]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:10:16 compute-0 python3.9[161157]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:10:16 compute-0 network[161174]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:10:16 compute-0 network[161175]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:10:16 compute-0 network[161176]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:10:21 compute-0 sudo[161446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvigucntgsunifbjjlllbnhwhzorqxiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247420.9374504-74-41302104272511/AnsiballZ_setup.py'
Feb 16 13:10:21 compute-0 sudo[161446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:21 compute-0 python3.9[161448]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:10:21 compute-0 sudo[161446]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:22 compute-0 sudo[161530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwltxxtqmkiralwfuezbjwiiczkxzdis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247420.9374504-74-41302104272511/AnsiballZ_dnf.py'
Feb 16 13:10:22 compute-0 sudo[161530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:22 compute-0 python3.9[161532]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:10:25 compute-0 sshd-session[161534]: Connection closed by authenticating user root 146.190.226.24 port 38136 [preauth]
Feb 16 13:10:27 compute-0 sudo[161530]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:28 compute-0 sudo[161685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snpioyzkucjiphtmbkpvwnkhecsemjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247428.4404943-98-172774607888051/AnsiballZ_stat.py'
Feb 16 13:10:28 compute-0 sudo[161685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:29 compute-0 python3.9[161687]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:29 compute-0 sudo[161685]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:29 compute-0 sudo[161837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onyohfbsgtkymcdbolrckzsnjqznnhrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247429.4072757-118-224669302845291/AnsiballZ_command.py'
Feb 16 13:10:29 compute-0 sudo[161837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:30 compute-0 python3.9[161839]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:30 compute-0 sudo[161837]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:30 compute-0 sudo[161990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxspjwlujrnvtmatrzjkkxwqelnzmlem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247430.4818816-138-275400006196843/AnsiballZ_stat.py'
Feb 16 13:10:30 compute-0 sudo[161990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:30 compute-0 python3.9[161992]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:30 compute-0 sudo[161990]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:31 compute-0 sudo[162142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyixwhbwhbhjcszycbdhgspzwborbrvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.236039-154-139302893184939/AnsiballZ_command.py'
Feb 16 13:10:31 compute-0 sudo[162142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:31 compute-0 python3.9[162144]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:31 compute-0 sudo[162142]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:32 compute-0 sudo[162295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bphaftlvvehomkxbimqrbwvmberhpdmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.9357758-170-16477541102288/AnsiballZ_stat.py'
Feb 16 13:10:32 compute-0 sudo[162295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:32 compute-0 python3.9[162297]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:32 compute-0 sudo[162295]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:32 compute-0 sudo[162418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsjcepcohxehqteixomsbtjztunlhzhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.9357758-170-16477541102288/AnsiballZ_copy.py'
Feb 16 13:10:32 compute-0 sudo[162418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:33 compute-0 python3.9[162420]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247431.9357758-170-16477541102288/.source.iscsi _original_basename=.h6pr2h2j follow=False checksum=c7007235e5484b036a2fe232d454bffd02be5917 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:33 compute-0 sudo[162418]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:33 compute-0 sudo[162570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdcbqfkkrdljrchosdnzsxdrcqygcend ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247433.2852306-200-99016132718115/AnsiballZ_file.py'
Feb 16 13:10:33 compute-0 sudo[162570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:34 compute-0 python3.9[162572]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:34 compute-0 sudo[162570]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:35 compute-0 sudo[162722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmedimqyjywdaxmqrscvachzuyvbvyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247434.464299-216-103958059228247/AnsiballZ_lineinfile.py'
Feb 16 13:10:35 compute-0 sudo[162722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:35 compute-0 python3.9[162724]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:35 compute-0 sudo[162722]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:36 compute-0 podman[162801]: 2026-02-16 13:10:36.015092704 +0000 UTC m=+0.056348700 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:10:36 compute-0 sudo[162893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brwyxmapcheikmwndhykcwubfcuspitg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247435.608117-234-162277368237607/AnsiballZ_systemd_service.py'
Feb 16 13:10:36 compute-0 sudo[162893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:36 compute-0 python3.9[162895]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:36 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 16 13:10:36 compute-0 sudo[162893]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:37 compute-0 sudo[163049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thxvcmyxzddgagesaalvaawpbmvvnwkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247436.894415-250-227272731380199/AnsiballZ_systemd_service.py'
Feb 16 13:10:37 compute-0 sudo[163049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:37 compute-0 python3.9[163051]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:37 compute-0 systemd[1]: Reloading.
Feb 16 13:10:37 compute-0 systemd-rc-local-generator[163082]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:37 compute-0 systemd-sysv-generator[163085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:37 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 13:10:37 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 16 13:10:37 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 16 13:10:37 compute-0 systemd[1]: Started Open-iSCSI.
Feb 16 13:10:37 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 16 13:10:37 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 16 13:10:37 compute-0 sudo[163049]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:38 compute-0 python3.9[163256]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:10:38 compute-0 network[163273]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:10:38 compute-0 network[163274]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:10:38 compute-0 network[163275]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:10:39 compute-0 podman[163282]: 2026-02-16 13:10:39.560360207 +0000 UTC m=+0.118546100 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Feb 16 13:10:41 compute-0 sshd-session[163447]: Connection closed by 64.227.72.94 port 51292
Feb 16 13:10:42 compute-0 sudo[163573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axdowbuoimzizegucfbooadposuqqjji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247442.6629355-296-171884758349205/AnsiballZ_dnf.py'
Feb 16 13:10:42 compute-0 sudo[163573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:43 compute-0 python3.9[163575]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:10:45 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:10:45 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:10:45 compute-0 systemd[1]: Reloading.
Feb 16 13:10:45 compute-0 systemd-rc-local-generator[163620]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:45 compute-0 systemd-sysv-generator[163625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:45 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:10:45 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:10:45 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:10:45 compute-0 systemd[1]: run-r8569346c18eb4694a9f63c3f715f0323.service: Deactivated successfully.
Feb 16 13:10:45 compute-0 sudo[163573]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:46 compute-0 sudo[163896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vonegwbroatflwqimnfrmdpvgbpqecqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247446.312411-314-56869836328661/AnsiballZ_file.py'
Feb 16 13:10:46 compute-0 sudo[163896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:46 compute-0 python3.9[163898]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 13:10:46 compute-0 sudo[163896]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:47 compute-0 sudo[164048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgwyjhdvqldkifqsxocpzbeoemsfcnlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247446.8989346-330-250443275214200/AnsiballZ_modprobe.py'
Feb 16 13:10:47 compute-0 sudo[164048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:47 compute-0 python3.9[164050]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 16 13:10:47 compute-0 sudo[164048]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:48 compute-0 sudo[164204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqyrmqlsfkeawddsuewvqtpoefabrriy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247447.7741013-346-221624700710250/AnsiballZ_stat.py'
Feb 16 13:10:48 compute-0 sudo[164204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:48 compute-0 python3.9[164206]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:48 compute-0 sudo[164204]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:48 compute-0 sudo[164327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiuopygbozjvksyvfypywjbzfpjakxaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247447.7741013-346-221624700710250/AnsiballZ_copy.py'
Feb 16 13:10:48 compute-0 sudo[164327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:48 compute-0 python3.9[164329]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247447.7741013-346-221624700710250/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:48 compute-0 sudo[164327]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:49 compute-0 sudo[164479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slknuxllvdsyqnkrsyocauljzdahbzyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247448.992471-378-272502217883177/AnsiballZ_lineinfile.py'
Feb 16 13:10:49 compute-0 sudo[164479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:49 compute-0 python3.9[164481]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:49 compute-0 sudo[164479]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:50 compute-0 sudo[164631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkcijvfgvcvflopmuwntfxgtglmedbwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247449.6618223-394-217627404044449/AnsiballZ_systemd.py'
Feb 16 13:10:50 compute-0 sudo[164631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:50 compute-0 python3.9[164633]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:10:50 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 13:10:50 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 16 13:10:50 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 16 13:10:50 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 13:10:50 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 13:10:50 compute-0 sudo[164631]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:51 compute-0 sudo[164787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iguezgmagutgmqbhzbqolrpzkrqpyeuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247450.7513406-410-271740763551410/AnsiballZ_command.py'
Feb 16 13:10:51 compute-0 sudo[164787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:51 compute-0 python3.9[164789]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:51 compute-0 sudo[164787]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:51 compute-0 sudo[164940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkngnwakuybiynspnzompfwcjopvlwag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247451.6425662-430-200556826994748/AnsiballZ_stat.py'
Feb 16 13:10:51 compute-0 sudo[164940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:52 compute-0 python3.9[164942]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:52 compute-0 sudo[164940]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:52 compute-0 sudo[165092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgwiadgvucyuwgimrzekxqwmaemrutvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247452.3896027-448-51853684009171/AnsiballZ_stat.py'
Feb 16 13:10:52 compute-0 sudo[165092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:52 compute-0 python3.9[165094]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:52 compute-0 sudo[165092]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:53 compute-0 sudo[165215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqrpuifmuugjudoopoltfmfsoixgcrwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247452.3896027-448-51853684009171/AnsiballZ_copy.py'
Feb 16 13:10:53 compute-0 sudo[165215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:53 compute-0 python3.9[165217]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247452.3896027-448-51853684009171/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:53 compute-0 sudo[165215]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:53 compute-0 sudo[165367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kytweattibruvrmkgkpoeuihxhzipvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247453.5692184-478-9308485292958/AnsiballZ_command.py'
Feb 16 13:10:53 compute-0 sudo[165367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:54 compute-0 python3.9[165369]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:54 compute-0 sudo[165367]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:54 compute-0 sudo[165520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkjioplvrnefbjdbrznnenzontnxcpzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247454.2428856-494-117363748596929/AnsiballZ_lineinfile.py'
Feb 16 13:10:54 compute-0 sudo[165520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:54 compute-0 python3.9[165522]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:54 compute-0 sudo[165520]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:55 compute-0 sudo[165672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknqgzdajuiruadwmptaxljfajzlocfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247454.9180627-510-46906169244105/AnsiballZ_replace.py'
Feb 16 13:10:55 compute-0 sudo[165672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:55 compute-0 python3.9[165674]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:55 compute-0 sudo[165672]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:56 compute-0 sudo[165824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjtbkimxgixspbpzypsywanruapvyvck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247455.7843463-526-116299915936345/AnsiballZ_replace.py'
Feb 16 13:10:56 compute-0 sudo[165824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:56 compute-0 python3.9[165826]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:56 compute-0 sudo[165824]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:56 compute-0 sudo[165976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uokmqxiviyeyitrerzjejacqejfpmjey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247456.461486-544-173090975015333/AnsiballZ_lineinfile.py'
Feb 16 13:10:56 compute-0 sudo[165976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:56 compute-0 python3.9[165978]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:56 compute-0 sudo[165976]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:57 compute-0 sudo[166128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvrotzxstxmxpjverdvikfageisdardi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247457.1861877-544-165387007476496/AnsiballZ_lineinfile.py'
Feb 16 13:10:57 compute-0 sudo[166128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:57 compute-0 python3.9[166130]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:57 compute-0 sudo[166128]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:58 compute-0 sudo[166280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvahrcqpjjfmumbzikcupreojfdzxgpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247457.9017107-544-215015494987636/AnsiballZ_lineinfile.py'
Feb 16 13:10:58 compute-0 sudo[166280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:58 compute-0 python3.9[166282]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:58 compute-0 sudo[166280]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:58 compute-0 sudo[166432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiqlcccjtjygvlkkbrzyenpucfxqjqva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247458.5125983-544-47702277953824/AnsiballZ_lineinfile.py'
Feb 16 13:10:58 compute-0 sudo[166432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:58 compute-0 python3.9[166434]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:58 compute-0 sudo[166432]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:59 compute-0 sudo[166584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqtcqzybxsupaeuavbyxrfqcyivbhhrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247459.126108-602-198847628237432/AnsiballZ_stat.py'
Feb 16 13:10:59 compute-0 sudo[166584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:59 compute-0 python3.9[166586]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:59 compute-0 sudo[166584]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:00 compute-0 sudo[166738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xatzbjjwmeeyqidrpjkmzbdcljzfclkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247459.8126538-618-179863960906140/AnsiballZ_command.py'
Feb 16 13:11:00 compute-0 sudo[166738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:00 compute-0 python3.9[166740]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:00 compute-0 sudo[166738]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:00 compute-0 sudo[166891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjqfqezppspisrhynqaiedkwysurzgui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247460.5479536-636-71532992847808/AnsiballZ_systemd_service.py'
Feb 16 13:11:00 compute-0 sudo[166891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:01 compute-0 python3.9[166893]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:01 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 16 13:11:01 compute-0 sudo[166891]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:01 compute-0 sudo[167047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymbcpcjtembdrqxgkuavpmkyxieercyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247461.3849313-652-200346351333439/AnsiballZ_systemd_service.py'
Feb 16 13:11:01 compute-0 sudo[167047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:01 compute-0 python3.9[167049]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:02 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 16 13:11:02 compute-0 udevadm[167054]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 16 13:11:02 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 16 13:11:02 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 13:11:02 compute-0 multipathd[167057]: --------start up--------
Feb 16 13:11:02 compute-0 multipathd[167057]: read /etc/multipath.conf
Feb 16 13:11:02 compute-0 multipathd[167057]: path checkers start up
Feb 16 13:11:02 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 13:11:02 compute-0 sudo[167047]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:02 compute-0 sudo[167214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgrhzfqxlxemxjkrhxpzqltosqctqxkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247462.6607594-676-26094436317344/AnsiballZ_file.py'
Feb 16 13:11:02 compute-0 sudo[167214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:03 compute-0 python3.9[167216]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 13:11:03 compute-0 sudo[167214]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:11:03.198 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:11:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:11:03.200 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:11:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:11:03.200 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:11:03 compute-0 sudo[167366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqurebouwzrmpztawwoxkbswltmeacil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247463.3868814-692-182914416917702/AnsiballZ_modprobe.py'
Feb 16 13:11:03 compute-0 sudo[167366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:03 compute-0 python3.9[167368]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 16 13:11:03 compute-0 kernel: Key type psk registered
Feb 16 13:11:03 compute-0 sudo[167366]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:04 compute-0 sudo[167531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mojqparrnmspcyjvjlopjqouwvvsbvwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247464.110876-708-158110343970153/AnsiballZ_stat.py'
Feb 16 13:11:04 compute-0 sudo[167531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:04 compute-0 python3.9[167533]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:11:04 compute-0 sudo[167531]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:05 compute-0 sudo[167654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqixhbycdssjdiocnmhfcelcduhmxnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247464.110876-708-158110343970153/AnsiballZ_copy.py'
Feb 16 13:11:05 compute-0 sudo[167654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:05 compute-0 python3.9[167656]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247464.110876-708-158110343970153/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:05 compute-0 sudo[167654]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:05 compute-0 sudo[167806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uipscyszuqzdicnprwpycoflerzyhioe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247465.586432-740-169865289430030/AnsiballZ_lineinfile.py'
Feb 16 13:11:05 compute-0 sudo[167806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:06 compute-0 python3.9[167808]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:06 compute-0 sudo[167806]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:06 compute-0 sudo[167969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zblopnmnadkbyofcrforgunszfgbdivz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247466.2914534-756-105755044922968/AnsiballZ_systemd.py'
Feb 16 13:11:06 compute-0 sudo[167969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:06 compute-0 podman[167932]: 2026-02-16 13:11:06.893970895 +0000 UTC m=+0.064245662 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 16 13:11:07 compute-0 python3.9[167977]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:07 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 13:11:07 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 16 13:11:07 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 16 13:11:07 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 13:11:07 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 13:11:07 compute-0 sudo[167969]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:07 compute-0 sudo[168134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euiabsitutwmtxxkuahwqbmagzvyexep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247467.6425774-772-262756000242797/AnsiballZ_dnf.py'
Feb 16 13:11:07 compute-0 sudo[168134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:08 compute-0 python3.9[168136]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:11:10 compute-0 podman[168141]: 2026-02-16 13:11:10.053363295 +0000 UTC m=+0.089342441 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:11:10 compute-0 systemd[1]: Reloading.
Feb 16 13:11:10 compute-0 systemd-rc-local-generator[168196]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:10 compute-0 systemd-sysv-generator[168199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:10 compute-0 systemd[1]: Reloading.
Feb 16 13:11:10 compute-0 systemd-rc-local-generator[168239]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:10 compute-0 systemd-sysv-generator[168243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:10 compute-0 systemd-logind[818]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 13:11:10 compute-0 systemd-logind[818]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 13:11:11 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:11:11 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:11:11 compute-0 systemd[1]: Reloading.
Feb 16 13:11:11 compute-0 systemd-rc-local-generator[168332]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:11 compute-0 systemd-sysv-generator[168339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:11 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:11:12 compute-0 sudo[168134]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:12 compute-0 sudo[169650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqcwxlopmitrihslajzxbspdhrfnoyhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247472.308049-788-174714399811952/AnsiballZ_systemd_service.py'
Feb 16 13:11:12 compute-0 sudo[169650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:12 compute-0 python3.9[169652]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:12 compute-0 iscsid[163098]: iscsid shutting down.
Feb 16 13:11:12 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 16 13:11:12 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 16 13:11:12 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 16 13:11:12 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 13:11:12 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 16 13:11:12 compute-0 systemd[1]: Started Open-iSCSI.
Feb 16 13:11:12 compute-0 sudo[169650]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:13 compute-0 sudo[169806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcgivrtpmxadcryslnzxcttnqkxulsss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247473.156319-804-250707275621085/AnsiballZ_systemd_service.py'
Feb 16 13:11:13 compute-0 sudo[169806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:13 compute-0 python3.9[169808]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:13 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 16 13:11:13 compute-0 multipathd[167057]: exit (signal)
Feb 16 13:11:13 compute-0 multipathd[167057]: --------shut down-------
Feb 16 13:11:13 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 16 13:11:13 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 16 13:11:13 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 13:11:13 compute-0 multipathd[169814]: --------start up--------
Feb 16 13:11:13 compute-0 multipathd[169814]: read /etc/multipath.conf
Feb 16 13:11:13 compute-0 multipathd[169814]: path checkers start up
Feb 16 13:11:13 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 13:11:13 compute-0 sudo[169806]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:14 compute-0 python3.9[169971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:11:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:11:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:11:15 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.225s CPU time.
Feb 16 13:11:15 compute-0 systemd[1]: run-r13bb6fde03754e9d89da93be83bce85a.service: Deactivated successfully.
Feb 16 13:11:15 compute-0 sudo[170126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzkwttxqdhoaycqrmhqwauxseoricnmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247475.420778-839-29983956469075/AnsiballZ_file.py'
Feb 16 13:11:15 compute-0 sudo[170126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:15 compute-0 python3.9[170128]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:15 compute-0 sudo[170126]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:16 compute-0 sudo[170278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvkjcmqmwsqhgotcrtmnsvmrjsicxbxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247476.4870327-861-141199978825686/AnsiballZ_systemd_service.py'
Feb 16 13:11:16 compute-0 sudo[170278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:17 compute-0 python3.9[170280]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:11:17 compute-0 systemd[1]: Reloading.
Feb 16 13:11:17 compute-0 systemd-sysv-generator[170306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:17 compute-0 systemd-rc-local-generator[170303]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:17 compute-0 sudo[170278]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:17 compute-0 python3.9[170472]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:11:17 compute-0 network[170489]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:11:17 compute-0 network[170490]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:11:17 compute-0 network[170491]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:11:22 compute-0 sudo[170762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmvcsdshqgvczinamwoiohbdinnqincs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247482.0140395-899-84676943364987/AnsiballZ_systemd_service.py'
Feb 16 13:11:22 compute-0 sudo[170762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:22 compute-0 python3.9[170764]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:22 compute-0 sudo[170762]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:23 compute-0 sudo[170915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aubcuvscjuupdfrporwuijjvcrjsdjpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247482.9715273-899-242335991735148/AnsiballZ_systemd_service.py'
Feb 16 13:11:23 compute-0 sudo[170915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:24 compute-0 python3.9[170917]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:24 compute-0 sudo[170915]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:24 compute-0 sudo[171068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqnfhzpbspbwpnqugxrysmhxhtpvmobn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247484.204479-899-156315783852888/AnsiballZ_systemd_service.py'
Feb 16 13:11:24 compute-0 sudo[171068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:24 compute-0 python3.9[171070]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:24 compute-0 sudo[171068]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:25 compute-0 sudo[171221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dglulwoiqxrqwaisiibsscqidibhllfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247485.1321588-899-9786017722064/AnsiballZ_systemd_service.py'
Feb 16 13:11:25 compute-0 sudo[171221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:25 compute-0 python3.9[171223]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:25 compute-0 sudo[171221]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:26 compute-0 sudo[171374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mudxqxapgpyajnfuziadenwzzlfguimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247485.8852808-899-238168749680151/AnsiballZ_systemd_service.py'
Feb 16 13:11:26 compute-0 sudo[171374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:26 compute-0 python3.9[171376]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:26 compute-0 sudo[171374]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:26 compute-0 sudo[171527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggwdhyhzypwuffznmynxmskzjlcybqfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247486.6915948-899-275364473440665/AnsiballZ_systemd_service.py'
Feb 16 13:11:26 compute-0 sudo[171527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:27 compute-0 python3.9[171529]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:27 compute-0 sudo[171527]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:27 compute-0 sudo[171680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkiykqrjbhmklztshxxlxgnhbeomrlwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247487.429057-899-105248756033329/AnsiballZ_systemd_service.py'
Feb 16 13:11:27 compute-0 sudo[171680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:27 compute-0 python3.9[171682]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:28 compute-0 sudo[171680]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:28 compute-0 sudo[171833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mblsrurpjtcmdiqkpxxhbuzfvzvmqlqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247488.1902928-899-69033509202217/AnsiballZ_systemd_service.py'
Feb 16 13:11:28 compute-0 sudo[171833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:28 compute-0 python3.9[171835]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:28 compute-0 sudo[171833]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:30 compute-0 sudo[171986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teqokrotqhttmevhasxsljqqxyjobnoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247490.5010443-1017-253175488451456/AnsiballZ_file.py'
Feb 16 13:11:30 compute-0 sudo[171986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:30 compute-0 python3.9[171988]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:30 compute-0 sudo[171986]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:31 compute-0 sudo[172140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iagrimzdvrgivukhvqbjpxctbjwgejtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247491.13069-1017-224040570486654/AnsiballZ_file.py'
Feb 16 13:11:31 compute-0 sudo[172140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:31 compute-0 sshd-session[171989]: Connection closed by authenticating user root 146.190.226.24 port 34052 [preauth]
Feb 16 13:11:31 compute-0 python3.9[172142]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:31 compute-0 sudo[172140]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:32 compute-0 sudo[172292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwopqbbgirtlnplyzjzakgbaqaaginve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247491.7655547-1017-96448983503757/AnsiballZ_file.py'
Feb 16 13:11:32 compute-0 sudo[172292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:32 compute-0 python3.9[172294]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:32 compute-0 sudo[172292]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:32 compute-0 sudo[172444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlquahimckpgsjyhtfrmulvtcthtwwkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247492.4684107-1017-66189060608303/AnsiballZ_file.py'
Feb 16 13:11:32 compute-0 sudo[172444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:33 compute-0 python3.9[172446]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:33 compute-0 sudo[172444]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:33 compute-0 sudo[172596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myyyugknzmpmrzqdjxgprojslgloamwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247493.1948938-1017-171642911443120/AnsiballZ_file.py'
Feb 16 13:11:33 compute-0 sudo[172596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:33 compute-0 python3.9[172598]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:33 compute-0 sudo[172596]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:33 compute-0 sudo[172748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsfdfjlgcubsqvonkkpdeyxngqzdxydt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247493.7706485-1017-125051260394183/AnsiballZ_file.py'
Feb 16 13:11:33 compute-0 sudo[172748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:34 compute-0 python3.9[172750]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:34 compute-0 sudo[172748]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:34 compute-0 sudo[172900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phuvkcipglaficuopmiqxkfrzwxczgmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247494.3347075-1017-188352434665213/AnsiballZ_file.py'
Feb 16 13:11:34 compute-0 sudo[172900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:34 compute-0 python3.9[172902]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:34 compute-0 sudo[172900]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:35 compute-0 sudo[173052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puzhhghipynvafbwppkfyhtlplfxaxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247494.8745553-1017-180539795225768/AnsiballZ_file.py'
Feb 16 13:11:35 compute-0 sudo[173052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:35 compute-0 python3.9[173054]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:35 compute-0 sudo[173052]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:36 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 16 13:11:36 compute-0 sudo[173205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekcwwvstspmqxgwjwwewlhliwtrylegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247495.9491568-1131-234132115923208/AnsiballZ_file.py'
Feb 16 13:11:36 compute-0 sudo[173205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:36 compute-0 python3.9[173207]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:36 compute-0 sudo[173205]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:37 compute-0 podman[173284]: 2026-02-16 13:11:37.021379048 +0000 UTC m=+0.060123634 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Feb 16 13:11:37 compute-0 sudo[173376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnmdfhyvdfqwfesrfluizxwcdlnnkrye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247496.8053467-1131-134922124237525/AnsiballZ_file.py'
Feb 16 13:11:37 compute-0 sudo[173376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:11:37 compute-0 python3.9[173378]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:37 compute-0 sudo[173376]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:38 compute-0 sudo[173529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilizgcckqjqifiahdutqhzlvagtxpcju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247497.7145755-1131-200192920547747/AnsiballZ_file.py'
Feb 16 13:11:38 compute-0 sudo[173529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:38 compute-0 python3.9[173531]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:38 compute-0 sudo[173529]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:38 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 16 13:11:38 compute-0 sudo[173682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-midkpvrlhcflywnbdxuyatncreujukyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247498.4485757-1131-247796967089533/AnsiballZ_file.py'
Feb 16 13:11:38 compute-0 sudo[173682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:38 compute-0 python3.9[173684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:38 compute-0 sudo[173682]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:39 compute-0 sudo[173834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mczhoxdjzrjectqehzvrqbkqelwhzmuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247499.0396602-1131-280848843415856/AnsiballZ_file.py'
Feb 16 13:11:39 compute-0 sudo[173834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:39 compute-0 python3.9[173836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:39 compute-0 sudo[173834]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:39 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 16 13:11:40 compute-0 sudo[174000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efynaumycplorcrkkdhvvlrdyenmruof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247499.73746-1131-216760736524249/AnsiballZ_file.py'
Feb 16 13:11:40 compute-0 sudo[174000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:40 compute-0 podman[173961]: 2026-02-16 13:11:40.457231892 +0000 UTC m=+0.076321041 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:11:40 compute-0 python3.9[174009]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:40 compute-0 sudo[174000]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:41 compute-0 sudo[174165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaeozaagcbknshwikvxrpzqguqzjzcxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247500.7827098-1131-223040521891937/AnsiballZ_file.py'
Feb 16 13:11:41 compute-0 sudo[174165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:41 compute-0 python3.9[174167]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:41 compute-0 sudo[174165]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:41 compute-0 sudo[174317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvxsblybwrclearngqtdhqvotbolcpbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247501.4503171-1131-150281721955475/AnsiballZ_file.py'
Feb 16 13:11:41 compute-0 sudo[174317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:41 compute-0 python3.9[174319]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:41 compute-0 sudo[174317]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:42 compute-0 sudo[174469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omvwkbpqhgclfbkvojtvxheewgkouksi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247502.51741-1247-239790088037456/AnsiballZ_command.py'
Feb 16 13:11:42 compute-0 sudo[174469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:43 compute-0 python3.9[174471]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:43 compute-0 sudo[174469]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:44 compute-0 python3.9[174623]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:11:45 compute-0 sudo[174773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vinsimkegauzmsuzizockicxquumpcts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247504.853306-1283-72247534133591/AnsiballZ_systemd_service.py'
Feb 16 13:11:45 compute-0 sudo[174773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:45 compute-0 python3.9[174775]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:11:45 compute-0 systemd[1]: Reloading.
Feb 16 13:11:45 compute-0 systemd-rc-local-generator[174804]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:45 compute-0 systemd-sysv-generator[174807]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:45 compute-0 sudo[174773]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:46 compute-0 sudo[174967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffossboeulknknuljwenjflcaxdbcrpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247506.0541453-1299-185874631663586/AnsiballZ_command.py'
Feb 16 13:11:46 compute-0 sudo[174967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:46 compute-0 python3.9[174969]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:46 compute-0 sudo[174967]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:47 compute-0 sudo[175120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gptzpdprsejqjfeiisfkdormlhqnalgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247506.7358806-1299-39742073820049/AnsiballZ_command.py'
Feb 16 13:11:47 compute-0 sudo[175120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:47 compute-0 python3.9[175122]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:47 compute-0 sudo[175120]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:47 compute-0 sudo[175273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiqnhbpgaznscadugkhuhhljcrzncmpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247507.3494127-1299-176643483484207/AnsiballZ_command.py'
Feb 16 13:11:47 compute-0 sudo[175273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:47 compute-0 python3.9[175275]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:47 compute-0 sudo[175273]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:48 compute-0 sudo[175426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkkjwqjfifpxbaapbumtownzlubdjuwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247507.9646807-1299-80566126972259/AnsiballZ_command.py'
Feb 16 13:11:48 compute-0 sudo[175426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:48 compute-0 python3.9[175428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:48 compute-0 sudo[175426]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:49 compute-0 sudo[175579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fepconwfkjtvxqqiltgljvgtgyezyvnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247508.9495614-1299-160675029867582/AnsiballZ_command.py'
Feb 16 13:11:49 compute-0 sudo[175579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:49 compute-0 python3.9[175581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:49 compute-0 sudo[175579]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:49 compute-0 sudo[175732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaqigiwryzixfvelbmgvvxbklxattmfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247509.5869076-1299-207094573309336/AnsiballZ_command.py'
Feb 16 13:11:49 compute-0 sudo[175732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:50 compute-0 python3.9[175734]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:50 compute-0 sudo[175732]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:50 compute-0 sudo[175885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eattqmjebpwmdhgbpxevcrtniexoosxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247510.2169025-1299-27589680310113/AnsiballZ_command.py'
Feb 16 13:11:50 compute-0 sudo[175885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:50 compute-0 python3.9[175887]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:50 compute-0 sudo[175885]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:51 compute-0 sudo[176038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syyuckaxdhecbvaeadksvjxmmbbhesxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247510.8944333-1299-269206567925110/AnsiballZ_command.py'
Feb 16 13:11:51 compute-0 sudo[176038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:51 compute-0 python3.9[176040]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:51 compute-0 sudo[176038]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:53 compute-0 sudo[176191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngiscahjvbmpxiwmvidndbanajsidufv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247512.741893-1442-194520935897963/AnsiballZ_file.py'
Feb 16 13:11:53 compute-0 sudo[176191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:53 compute-0 python3.9[176193]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:53 compute-0 sudo[176191]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:53 compute-0 sudo[176343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhbzpkassdaviwzxjoxesiomvtklzbgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247513.400528-1442-124356481052729/AnsiballZ_file.py'
Feb 16 13:11:53 compute-0 sudo[176343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:53 compute-0 python3.9[176345]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:53 compute-0 sudo[176343]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:54 compute-0 sudo[176495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xischckhvqryeqipatkpfydqdunbctix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247514.1485622-1472-96648267819839/AnsiballZ_file.py'
Feb 16 13:11:54 compute-0 sudo[176495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:54 compute-0 python3.9[176497]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:54 compute-0 sudo[176495]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:55 compute-0 sudo[176647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsuwhbcbtoqurffvpkqzduhialhdwcau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247515.134792-1472-168897224922164/AnsiballZ_file.py'
Feb 16 13:11:55 compute-0 sudo[176647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:55 compute-0 python3.9[176649]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:55 compute-0 sudo[176647]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:56 compute-0 sudo[176799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmzhnugmdqepuxcecvkufjctbriephhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247515.7332563-1472-132525260288838/AnsiballZ_file.py'
Feb 16 13:11:56 compute-0 sudo[176799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:56 compute-0 python3.9[176801]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:56 compute-0 sudo[176799]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:56 compute-0 sudo[176951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aschzpvaamgscfocjxwiekxjafotcgjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247516.4065762-1472-200473221041172/AnsiballZ_file.py'
Feb 16 13:11:56 compute-0 sudo[176951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:56 compute-0 python3.9[176953]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:56 compute-0 sudo[176951]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:57 compute-0 sudo[177103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nftnzplojcqsxrbilemlrucwpbbvcjjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247517.0550473-1472-151630227626034/AnsiballZ_file.py'
Feb 16 13:11:57 compute-0 sudo[177103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:57 compute-0 python3.9[177105]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:57 compute-0 sudo[177103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:57 compute-0 sudo[177255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgycesrtcohskkqkwlyplnbunjfbozuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247517.6907873-1472-140732422510201/AnsiballZ_file.py'
Feb 16 13:11:57 compute-0 sudo[177255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:58 compute-0 python3.9[177257]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:58 compute-0 sudo[177255]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:58 compute-0 sudo[177407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcnhjolnembtcltygtxbigfluvhptxjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247518.307979-1472-258680472398349/AnsiballZ_file.py'
Feb 16 13:11:58 compute-0 sudo[177407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:58 compute-0 python3.9[177409]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:58 compute-0 sudo[177407]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:01 compute-0 anacron[60464]: Job `cron.daily' started
Feb 16 13:12:01 compute-0 anacron[60464]: Job `cron.daily' terminated
Feb 16 13:12:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:12:03.200 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:12:03.203 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:12:03.203 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:05 compute-0 sudo[177561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbphkvtrwsmwlpgfftvstepgsvzprma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247524.7862709-1709-149259327623967/AnsiballZ_getent.py'
Feb 16 13:12:05 compute-0 sudo[177561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:05 compute-0 python3.9[177563]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 16 13:12:05 compute-0 sudo[177561]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:06 compute-0 sudo[177714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbfegxacjvafedeusotvuxqfvzixrisv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247525.6187668-1725-198857267614636/AnsiballZ_group.py'
Feb 16 13:12:06 compute-0 sudo[177714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:06 compute-0 python3.9[177716]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:12:06 compute-0 groupadd[177717]: group added to /etc/group: name=nova, GID=42436
Feb 16 13:12:06 compute-0 groupadd[177717]: group added to /etc/gshadow: name=nova
Feb 16 13:12:06 compute-0 groupadd[177717]: new group: name=nova, GID=42436
Feb 16 13:12:06 compute-0 sudo[177714]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:07 compute-0 sudo[177883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlkwldyowhpllsxeadmfhptjdcjpfddx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247526.5513535-1741-48950013315685/AnsiballZ_user.py'
Feb 16 13:12:07 compute-0 sudo[177883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:07 compute-0 podman[177846]: 2026-02-16 13:12:07.209137374 +0000 UTC m=+0.063992252 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:12:07 compute-0 python3.9[177891]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:12:07 compute-0 useradd[177896]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Feb 16 13:12:07 compute-0 useradd[177896]: add 'nova' to group 'libvirt'
Feb 16 13:12:07 compute-0 useradd[177896]: add 'nova' to shadow group 'libvirt'
Feb 16 13:12:07 compute-0 sudo[177883]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:08 compute-0 sshd-session[177927]: Accepted publickey for zuul from 192.168.122.30 port 56636 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:12:08 compute-0 systemd-logind[818]: New session 25 of user zuul.
Feb 16 13:12:08 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 16 13:12:08 compute-0 sshd-session[177927]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:12:09 compute-0 sshd-session[177930]: Received disconnect from 192.168.122.30 port 56636:11: disconnected by user
Feb 16 13:12:09 compute-0 sshd-session[177930]: Disconnected from user zuul 192.168.122.30 port 56636
Feb 16 13:12:09 compute-0 sshd-session[177927]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:12:09 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 16 13:12:09 compute-0 systemd-logind[818]: Session 25 logged out. Waiting for processes to exit.
Feb 16 13:12:09 compute-0 systemd-logind[818]: Removed session 25.
Feb 16 13:12:09 compute-0 python3.9[178080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:09 compute-0 python3.9[178156]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:10 compute-0 python3.9[178306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:10 compute-0 podman[178401]: 2026-02-16 13:12:10.838918734 +0000 UTC m=+0.062176111 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:12:10 compute-0 python3.9[178440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247530.082864-1791-273280741396178/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:11 compute-0 python3.9[178603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:12 compute-0 python3.9[178724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247531.1191716-1791-31281092476454/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:12 compute-0 python3.9[178874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:13 compute-0 python3.9[178995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247532.3612888-1791-195467073493592/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:13 compute-0 python3.9[179145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:14 compute-0 python3.9[179266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247533.5026581-1899-199854416994455/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:14 compute-0 sudo[179416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hllocrtlxhgeoowibfscpaikaknwjjbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247534.574937-1929-110148754392828/AnsiballZ_file.py'
Feb 16 13:12:14 compute-0 sudo[179416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:15 compute-0 python3.9[179418]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:15 compute-0 sudo[179416]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:15 compute-0 sudo[179568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quzgaslseioqnleuovdgdaujrnaqmhbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247535.2753918-1945-113558699118466/AnsiballZ_copy.py'
Feb 16 13:12:15 compute-0 sudo[179568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:15 compute-0 python3.9[179570]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:15 compute-0 sudo[179568]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:16 compute-0 sudo[179720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzlausufapvhwgmahytjyydmzeydjqjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.263579-1961-23083902969085/AnsiballZ_stat.py'
Feb 16 13:12:16 compute-0 sudo[179720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:16 compute-0 python3.9[179722]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:16 compute-0 sudo[179720]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:17 compute-0 sudo[179872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voxxvlhgtscjssmwynatfhoakfkoykcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.8157396-1977-178209365149038/AnsiballZ_stat.py'
Feb 16 13:12:17 compute-0 sudo[179872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:17 compute-0 python3.9[179874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:17 compute-0 sudo[179872]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:17 compute-0 sudo[179995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eccnkaqvyfobjogndzefebmvvzcztmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.8157396-1977-178209365149038/AnsiballZ_copy.py'
Feb 16 13:12:17 compute-0 sudo[179995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:17 compute-0 python3.9[179997]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771247536.8157396-1977-178209365149038/.source _original_basename=.wmlial0x follow=False checksum=c59bb849d4258243b6df389b11d934a684b189a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 16 13:12:17 compute-0 sudo[179995]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:18 compute-0 python3.9[180149]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:19 compute-0 sudo[180301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igzutquzyulhcqqinlnatyknerrinadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247539.1525152-2033-136767059425572/AnsiballZ_file.py'
Feb 16 13:12:19 compute-0 sudo[180301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:19 compute-0 python3.9[180303]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:19 compute-0 sudo[180301]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:20 compute-0 sudo[180453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxubwovkiemyeavebqvufftrhcgvjxoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247539.9399457-2049-261637164692479/AnsiballZ_file.py'
Feb 16 13:12:20 compute-0 sudo[180453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:20 compute-0 python3.9[180455]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:20 compute-0 sudo[180453]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:20 compute-0 python3.9[180605]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:23 compute-0 sudo[181026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysiteilijkvgxnzpdbwitacpakuqsstr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247543.3455405-2117-2710320801549/AnsiballZ_container_config_data.py'
Feb 16 13:12:23 compute-0 sudo[181026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:24 compute-0 python3.9[181028]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 16 13:12:24 compute-0 sudo[181026]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:24 compute-0 sudo[181178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhkuqxxegzfevoqcvgwkhjlprczpizbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247544.4507353-2139-255854129552536/AnsiballZ_container_config_hash.py'
Feb 16 13:12:24 compute-0 sudo[181178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:25 compute-0 python3.9[181180]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:12:25 compute-0 sudo[181178]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:25 compute-0 sudo[181330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrytoleosxbzxroiuxtbojygurgjwtby ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247545.3983266-2159-85675159010144/AnsiballZ_edpm_container_manage.py'
Feb 16 13:12:25 compute-0 sudo[181330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:26 compute-0 python3[181332]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:12:26 compute-0 podman[181368]: 2026-02-16 13:12:26.37130909 +0000 UTC m=+0.113153790 container create 71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260127, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 16 13:12:26 compute-0 podman[181368]: 2026-02-16 13:12:26.275525382 +0000 UTC m=+0.017370082 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 13:12:26 compute-0 python3[181332]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 16 13:12:26 compute-0 sudo[181330]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:27 compute-0 sudo[181556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuytrwikcbhgbtbbfbzkaijovjttaoet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247546.6612687-2175-83215947972877/AnsiballZ_stat.py'
Feb 16 13:12:27 compute-0 sudo[181556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:27 compute-0 python3.9[181558]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:27 compute-0 sudo[181556]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:28 compute-0 python3.9[181710]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:12:29 compute-0 sudo[181860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hofqffgcyznzqweeemwfrrwbchhjtxkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247549.1074824-2229-55514150410585/AnsiballZ_stat.py'
Feb 16 13:12:29 compute-0 sudo[181860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:29 compute-0 python3.9[181862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:29 compute-0 sudo[181860]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:29 compute-0 sudo[181985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuizlcpeztsbidkgsstjfuwvpeqyspwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247549.1074824-2229-55514150410585/AnsiballZ_copy.py'
Feb 16 13:12:29 compute-0 sudo[181985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:30 compute-0 python3.9[181987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247549.1074824-2229-55514150410585/.source.yaml _original_basename=.rs8_hjol follow=False checksum=35696d2121a915adbae8ecc15e69892c5fbd315a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:30 compute-0 sudo[181985]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:30 compute-0 sudo[182137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkkcfoaebgkbcqencjanjgusuxojsonc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247550.5218835-2263-46660306303718/AnsiballZ_file.py'
Feb 16 13:12:30 compute-0 sudo[182137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:30 compute-0 python3.9[182139]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:30 compute-0 sudo[182137]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:31 compute-0 sudo[182289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgwylwxlfcrsozgllhjdorthnrybdyzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.2928197-2279-2193493485974/AnsiballZ_file.py'
Feb 16 13:12:31 compute-0 sudo[182289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:31 compute-0 python3.9[182291]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:31 compute-0 sudo[182289]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:32 compute-0 sudo[182441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyljlnpsnisjuibubaulropnmkjcfhxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.9279904-2295-15410555096015/AnsiballZ_stat.py'
Feb 16 13:12:32 compute-0 sudo[182441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:32 compute-0 python3.9[182443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:32 compute-0 sudo[182441]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:32 compute-0 sudo[182564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kedngvkqjnkmlyygdzurytwrwktinmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.9279904-2295-15410555096015/AnsiballZ_copy.py'
Feb 16 13:12:32 compute-0 sudo[182564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:32 compute-0 python3.9[182566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247551.9279904-2295-15410555096015/.source.json _original_basename=.99g_upmg follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:32 compute-0 sudo[182564]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:33 compute-0 python3.9[182716]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:35 compute-0 sudo[183137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkehqqxzgcyczwvrxseyepgwkvmtstrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247555.4491386-2375-268009506927686/AnsiballZ_container_config_data.py'
Feb 16 13:12:35 compute-0 sudo[183137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:35 compute-0 python3.9[183139]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 16 13:12:35 compute-0 sudo[183137]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:36 compute-0 sudo[183289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghtxfsjofyhfnrompfezwojpxfnewpsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247556.3251624-2397-215981921311089/AnsiballZ_container_config_hash.py'
Feb 16 13:12:36 compute-0 sudo[183289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:36 compute-0 python3.9[183291]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:12:36 compute-0 sudo[183289]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:37 compute-0 sudo[183454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hetpqxmalqvswavhdcasmcqwgjewlbsp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247557.2085662-2417-35595972285750/AnsiballZ_edpm_container_manage.py'
Feb 16 13:12:37 compute-0 sudo[183454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:37 compute-0 podman[183417]: 2026-02-16 13:12:37.500441395 +0000 UTC m=+0.068305535 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:12:37 compute-0 python3[183458]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:12:37 compute-0 podman[183501]: 2026-02-16 13:12:37.902122046 +0000 UTC m=+0.053426583 container create c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=nova_compute, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:12:37 compute-0 podman[183501]: 2026-02-16 13:12:37.874385624 +0000 UTC m=+0.025690261 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 13:12:37 compute-0 python3[183458]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 16 13:12:38 compute-0 sudo[183454]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:38 compute-0 sshd-session[183334]: Connection closed by authenticating user root 146.190.226.24 port 50546 [preauth]
Feb 16 13:12:38 compute-0 sudo[183687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptkilspzudmbvtxqexxfcraipjhfjleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.2095268-2433-14926927160318/AnsiballZ_stat.py'
Feb 16 13:12:38 compute-0 sudo[183687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:38 compute-0 python3.9[183689]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:38 compute-0 sudo[183687]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:39 compute-0 sudo[183841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrqoenyawxflktannbmdmjvnigbpaldt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.967621-2451-216126934928697/AnsiballZ_file.py'
Feb 16 13:12:39 compute-0 sudo[183841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:39 compute-0 python3.9[183843]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:39 compute-0 sudo[183841]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:39 compute-0 sudo[183917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgjxgpravvhmjmvjuftysspnzivmhgcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.967621-2451-216126934928697/AnsiballZ_stat.py'
Feb 16 13:12:39 compute-0 sudo[183917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:40 compute-0 python3.9[183919]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:40 compute-0 sudo[183917]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:40 compute-0 sudo[184068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecsjxzkgwyhbfijnutoykiwtcovxlnjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1126633-2451-243398482335054/AnsiballZ_copy.py'
Feb 16 13:12:40 compute-0 sudo[184068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:40 compute-0 python3.9[184070]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247560.1126633-2451-243398482335054/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:40 compute-0 sudo[184068]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:40 compute-0 sudo[184144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvbjvhyubnnqxqndgzqhqigiieqhwwhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1126633-2451-243398482335054/AnsiballZ_systemd.py'
Feb 16 13:12:40 compute-0 sudo[184144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:41 compute-0 podman[184146]: 2026-02-16 13:12:41.075259001 +0000 UTC m=+0.143082490 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:12:41 compute-0 python3.9[184147]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:12:41 compute-0 systemd[1]: Reloading.
Feb 16 13:12:41 compute-0 systemd-rc-local-generator[184202]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:12:41 compute-0 systemd-sysv-generator[184205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:12:41 compute-0 sudo[184144]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:41 compute-0 sudo[184289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwvgmcancvzqslfsdevrohtfrcbpfsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1126633-2451-243398482335054/AnsiballZ_systemd.py'
Feb 16 13:12:41 compute-0 sudo[184289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:42 compute-0 python3.9[184291]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:12:42 compute-0 systemd[1]: Reloading.
Feb 16 13:12:42 compute-0 systemd-rc-local-generator[184316]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:12:42 compute-0 systemd-sysv-generator[184320]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:12:42 compute-0 systemd[1]: Starting nova_compute container...
Feb 16 13:12:42 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:12:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-0 podman[184338]: 2026-02-16 13:12:42.488072328 +0000 UTC m=+0.102199301 container init c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:12:42 compute-0 podman[184338]: 2026-02-16 13:12:42.502867617 +0000 UTC m=+0.116994610 container start c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Feb 16 13:12:42 compute-0 podman[184338]: nova_compute
Feb 16 13:12:42 compute-0 systemd[1]: Started nova_compute container.
Feb 16 13:12:42 compute-0 nova_compute[184354]: + sudo -E kolla_set_configs
Feb 16 13:12:42 compute-0 sudo[184289]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Validating config file
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying service configuration files
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Deleting /etc/ceph
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Creating directory /etc/ceph
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Writing out command to execute
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-0 nova_compute[184354]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-0 nova_compute[184354]: ++ cat /run_command
Feb 16 13:12:42 compute-0 nova_compute[184354]: + CMD=nova-compute
Feb 16 13:12:42 compute-0 nova_compute[184354]: + ARGS=
Feb 16 13:12:42 compute-0 nova_compute[184354]: + sudo kolla_copy_cacerts
Feb 16 13:12:42 compute-0 nova_compute[184354]: + [[ ! -n '' ]]
Feb 16 13:12:42 compute-0 nova_compute[184354]: + . kolla_extend_start
Feb 16 13:12:42 compute-0 nova_compute[184354]: Running command: 'nova-compute'
Feb 16 13:12:42 compute-0 nova_compute[184354]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 13:12:42 compute-0 nova_compute[184354]: + umask 0022
Feb 16 13:12:42 compute-0 nova_compute[184354]: + exec nova-compute
Feb 16 13:12:43 compute-0 python3.9[184515]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:12:44 compute-0 sudo[184668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxelwzuroslkrsyizeylzfnxuilfillc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247563.9968643-2541-171230754685910/AnsiballZ_stat.py'
Feb 16 13:12:44 compute-0 sudo[184668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.239 184358 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.239 184358 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.240 184358 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.240 184358 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.353 184358 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.361 184358 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.361 184358 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 13:12:44 compute-0 python3.9[184670]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:44 compute-0 sudo[184668]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:44 compute-0 sudo[184795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihmdptrqqwamuvtmtjqlcrzbmmhdhiuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247563.9968643-2541-171230754685910/AnsiballZ_copy.py'
Feb 16 13:12:44 compute-0 sudo[184795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:44 compute-0 nova_compute[184354]: 2026-02-16 13:12:44.946 184358 INFO nova.virt.driver [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 13:12:44 compute-0 python3.9[184797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247563.9968643-2541-171230754685910/.source.yaml _original_basename=.b_zda745 follow=False checksum=1f7d38bfcf59309f34da6c109f1ea60c6e218870 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:44 compute-0 sudo[184795]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.045 184358 INFO nova.compute.provider_config [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.071 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.071 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.071 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.072 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.073 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.074 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.075 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.076 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.076 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.076 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.076 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.076 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.077 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.078 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.079 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.080 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.081 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.082 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.083 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.084 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.085 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.086 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.087 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.088 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.089 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.090 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.091 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.092 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.093 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.094 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.095 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.096 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.097 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.098 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.099 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.100 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.101 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.102 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.103 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.104 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.104 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.104 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.104 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.104 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.105 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.106 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.107 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.108 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.109 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.110 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.111 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.112 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.113 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.114 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.115 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.116 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.117 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.118 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.119 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.120 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.121 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.122 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.123 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.124 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.125 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.126 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.127 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.128 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.129 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.129 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.129 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.129 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.129 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.130 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.130 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.130 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.130 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.130 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.131 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.132 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.132 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.132 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.132 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.132 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.133 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.134 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.135 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.136 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.137 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.137 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.137 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.137 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.137 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.138 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.139 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.140 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.141 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.142 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.142 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.142 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.142 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.142 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.143 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.143 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.143 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.143 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.143 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.144 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.144 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.144 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.144 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.144 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.145 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.146 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.147 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 WARNING oslo_config.cfg [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 13:12:45 compute-0 nova_compute[184354]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 13:12:45 compute-0 nova_compute[184354]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 13:12:45 compute-0 nova_compute[184354]: and ``live_migration_inbound_addr`` respectively.
Feb 16 13:12:45 compute-0 nova_compute[184354]: ).  Its value may be silently ignored in the future.
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.148 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.149 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.150 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.151 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.152 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.153 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.154 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.155 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.156 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.157 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.158 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.159 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.160 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.161 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.162 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.163 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.164 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.165 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.166 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.167 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.168 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.168 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.168 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.168 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.168 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.169 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.170 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.171 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.171 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.171 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.171 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.171 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.172 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.172 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.172 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.172 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.172 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.173 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.174 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.174 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.174 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.174 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.174 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.175 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.175 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.175 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.175 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.176 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.177 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.177 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.177 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.177 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.177 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.178 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.178 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.178 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.178 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.178 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.179 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.179 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.179 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.179 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.179 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.180 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.180 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.180 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.180 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.180 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.181 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.182 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.182 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.182 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.182 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.182 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.183 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.184 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.184 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.184 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.184 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.184 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.185 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.185 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.185 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.185 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.185 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.186 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.187 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.187 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.187 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.187 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.187 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.188 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.188 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.188 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.188 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.189 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.189 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.189 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.189 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.189 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.190 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.190 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.190 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.190 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.190 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.191 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.192 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.192 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.192 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.192 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.192 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.193 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.194 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.194 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.194 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.194 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.194 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.195 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.196 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.196 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.196 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.196 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.197 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.198 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.198 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.198 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.198 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.198 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.199 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.199 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.199 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.199 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.199 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.200 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.201 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.201 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.201 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.201 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.201 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.202 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.203 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.203 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.203 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.203 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.203 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.204 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.204 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.204 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.204 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.204 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.205 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.205 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.205 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.205 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.205 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.206 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.207 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.207 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.207 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.207 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.207 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.208 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.209 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.210 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.211 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.212 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.213 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.214 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.215 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.216 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.217 184358 DEBUG oslo_service.service [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.218 184358 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.240 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.241 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.242 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.242 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 13:12:45 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 13:12:45 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.305 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7b00d97e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.308 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7b00d97e50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.309 184358 INFO nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Connection event '1' reason 'None'
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.336 184358 WARNING nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 16 13:12:45 compute-0 nova_compute[184354]: 2026-02-16 13:12:45.337 184358 DEBUG nova.virt.libvirt.volume.mount [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 13:12:45 compute-0 python3.9[185007]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.127 184358 INFO nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]: 
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <host>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <uuid>880a2e20-6f11-46bb-b51c-b4136280b28f</uuid>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <arch>x86_64</arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <microcode version='16777317'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <signature family='23' model='49' stepping='0'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='x2apic'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='tsc-deadline'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='osxsave'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='hypervisor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='tsc_adjust'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='spec-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='stibp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='arch-capabilities'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='cmp_legacy'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='topoext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='virt-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='lbrv'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='tsc-scale'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='vmcb-clean'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='pause-filter'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='pfthreshold'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='svme-addr-chk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='rdctl-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='mds-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature name='pschange-mc-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <pages unit='KiB' size='4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <pages unit='KiB' size='2048'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <pages unit='KiB' size='1048576'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <power_management>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <suspend_mem/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <suspend_disk/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <suspend_hybrid/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </power_management>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <iommu support='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <migration_features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <live/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <uri_transports>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <uri_transport>tcp</uri_transport>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <uri_transport>rdma</uri_transport>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </uri_transports>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </migration_features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <topology>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <cells num='1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <cell id='0'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <memory unit='KiB'>7864292</memory>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <distances>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <sibling id='0' value='10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           </distances>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           <cpus num='8'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:           </cpus>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         </cell>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </cells>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </topology>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <cache>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </cache>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <secmodel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model>selinux</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <doi>0</doi>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </secmodel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <secmodel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model>dac</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <doi>0</doi>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </secmodel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </host>
Feb 16 13:12:46 compute-0 nova_compute[184354]: 
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <guest>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <os_type>hvm</os_type>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <arch name='i686'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <wordsize>32</wordsize>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <domain type='qemu'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <domain type='kvm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <pae/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <nonpae/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <apic default='on' toggle='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <cpuselection/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <deviceboot/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <externalSnapshot/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </guest>
Feb 16 13:12:46 compute-0 nova_compute[184354]: 
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <guest>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <os_type>hvm</os_type>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <arch name='x86_64'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <wordsize>64</wordsize>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <domain type='qemu'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <domain type='kvm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <apic default='on' toggle='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <cpuselection/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <deviceboot/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <externalSnapshot/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </guest>
Feb 16 13:12:46 compute-0 nova_compute[184354]: 
Feb 16 13:12:46 compute-0 nova_compute[184354]: </capabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]: 
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.135 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.156 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 13:12:46 compute-0 nova_compute[184354]: <domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <arch>i686</arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <vcpu max='240'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <os supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <loader supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>rom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pflash</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='readonly'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>yes</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='secure'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </loader>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </os>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>anonymous</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>memfd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </memoryBacking>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <disk supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>disk</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cdrom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>floppy</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>lun</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ide</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>fdc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>sata</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </disk>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vnc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </graphics>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <video supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='modelType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vga</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cirrus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>none</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>bochs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ramfb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </video>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='mode'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>subsystem</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>mandatory</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>requisite</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>optional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pci</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hostdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <rng supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>random</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </rng>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='driverType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>path</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>handle</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </filesystem>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emulator</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>external</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>2.0</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </tpm>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </redirdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <channel supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </channel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </crypto>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <interface supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>passt</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </interface>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <panic supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>isa</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>hyperv</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </panic>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <console supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>null</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dev</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pipe</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stdio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>udp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tcp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </console>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <gic supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sev supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='features'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>relaxed</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vapic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vpindex</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>runtime</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>synic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stimer</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reset</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>frequencies</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ipi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>avic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hyperv>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]: </domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.168 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 13:12:46 compute-0 nova_compute[184354]: <domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <arch>i686</arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <vcpu max='4096'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <os supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <loader supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>rom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pflash</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='readonly'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>yes</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='secure'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </loader>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </os>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>anonymous</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>memfd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </memoryBacking>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <disk supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>disk</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cdrom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>floppy</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>lun</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>fdc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>sata</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </disk>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vnc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </graphics>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <video supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='modelType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vga</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cirrus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>none</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>bochs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ramfb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </video>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='mode'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>subsystem</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>mandatory</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>requisite</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>optional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pci</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hostdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <rng supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>random</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </rng>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='driverType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>path</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>handle</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </filesystem>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emulator</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>external</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>2.0</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </tpm>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </redirdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <channel supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </channel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </crypto>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <interface supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>passt</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </interface>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <panic supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>isa</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>hyperv</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </panic>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <console supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>null</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dev</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pipe</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stdio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>udp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tcp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </console>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <gic supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sev supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='features'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>relaxed</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vapic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vpindex</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>runtime</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>synic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stimer</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reset</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>frequencies</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ipi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>avic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hyperv>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]: </domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.210 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.215 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 13:12:46 compute-0 nova_compute[184354]: <domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <arch>x86_64</arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <vcpu max='240'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <os supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <loader supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>rom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pflash</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='readonly'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>yes</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='secure'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </loader>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </os>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>anonymous</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>memfd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </memoryBacking>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <disk supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>disk</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cdrom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>floppy</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>lun</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ide</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>fdc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>sata</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </disk>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vnc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </graphics>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <video supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='modelType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vga</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cirrus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>none</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>bochs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ramfb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </video>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='mode'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>subsystem</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>mandatory</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>requisite</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>optional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pci</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hostdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <rng supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>random</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </rng>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='driverType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>path</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>handle</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </filesystem>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emulator</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>external</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>2.0</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </tpm>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </redirdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <channel supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </channel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </crypto>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <interface supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>passt</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </interface>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <panic supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>isa</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>hyperv</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </panic>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <console supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>null</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dev</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pipe</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stdio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>udp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tcp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </console>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <gic supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sev supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='features'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>relaxed</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vapic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vpindex</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>runtime</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>synic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stimer</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reset</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>frequencies</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ipi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>avic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hyperv>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]: </domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.295 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 13:12:46 compute-0 nova_compute[184354]: <domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <arch>x86_64</arch>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <vcpu max='4096'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <os supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='firmware'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>efi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <loader supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>rom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pflash</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='readonly'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>yes</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='secure'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>yes</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>no</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </loader>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </os>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>on</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>off</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xop'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='la57'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='lam'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='hle'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='pku'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='erms'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='ss'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </blockers>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </mode>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>anonymous</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <value>memfd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </memoryBacking>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <disk supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>disk</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cdrom</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>floppy</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>lun</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>fdc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>sata</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </disk>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vnc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </graphics>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <video supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='modelType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vga</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>cirrus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>none</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>bochs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ramfb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </video>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='mode'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>subsystem</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>mandatory</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>requisite</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>optional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pci</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>scsi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hostdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <rng supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>random</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>egd</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </rng>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='driverType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>path</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>handle</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </filesystem>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emulator</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>external</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>2.0</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </tpm>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='bus'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>usb</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </redirdev>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <channel supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </channel>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>builtin</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </crypto>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <interface supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='backendType'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>default</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>passt</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </interface>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <panic supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='model'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>isa</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>hyperv</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </panic>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <console supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='type'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>null</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vc</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pty</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dev</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>file</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>pipe</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stdio</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>udp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tcp</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>unix</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>dbus</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </console>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </devices>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <features>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <gic supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sev supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <enum name='features'>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>relaxed</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vapic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vpindex</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>runtime</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>synic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>stimer</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reset</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>frequencies</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>ipi</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>avic</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </enum>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       <defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-0 nova_compute[184354]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-0 nova_compute[184354]:       </defaults>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     </hyperv>
Feb 16 13:12:46 compute-0 nova_compute[184354]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-0 nova_compute[184354]:   </features>
Feb 16 13:12:46 compute-0 nova_compute[184354]: </domainCapabilities>
Feb 16 13:12:46 compute-0 nova_compute[184354]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.382 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.382 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.383 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.389 184358 INFO nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Secure Boot support detected
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.392 184358 INFO nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.392 184358 INFO nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.406 184358 DEBUG nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 13:12:46 compute-0 nova_compute[184354]:   <model>Nehalem</model>
Feb 16 13:12:46 compute-0 nova_compute[184354]: </cpu>
Feb 16 13:12:46 compute-0 nova_compute[184354]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.408 184358 DEBUG nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.475 184358 INFO nova.virt.node [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Determined node identity c9501a85-df32-4b8f-bce0-9425ef1e7866 from /var/lib/nova/compute_id
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.496 184358 WARNING nova.compute.manager [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Compute nodes ['c9501a85-df32-4b8f-bce0-9425ef1e7866'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.543 184358 INFO nova.compute.manager [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.634 184358 WARNING nova.compute.manager [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.635 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.635 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.635 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.635 184358 DEBUG nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:12:46 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 13:12:46 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 16 13:12:46 compute-0 python3.9[185161]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.879 184358 WARNING nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.880 184358 DEBUG nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6167MB free_disk=73.43989944458008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.880 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.880 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.902 184358 WARNING nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] No compute node record for compute-0.ctlplane.example.com:c9501a85-df32-4b8f-bce0-9425ef1e7866: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host c9501a85-df32-4b8f-bce0-9425ef1e7866 could not be found.
Feb 16 13:12:46 compute-0 nova_compute[184354]: 2026-02-16 13:12:46.934 184358 INFO nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: c9501a85-df32-4b8f-bce0-9425ef1e7866
Feb 16 13:12:47 compute-0 nova_compute[184354]: 2026-02-16 13:12:47.086 184358 DEBUG nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:12:47 compute-0 nova_compute[184354]: 2026-02-16 13:12:47.087 184358 DEBUG nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:12:47 compute-0 python3.9[185334]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.397 184358 INFO nova.scheduler.client.report [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] [req-82baeeba-8774-43e3-8a66-e5878c8ef851] Created resource provider record via placement API for resource provider with UUID c9501a85-df32-4b8f-bce0-9425ef1e7866 and name compute-0.ctlplane.example.com.
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.470 184358 DEBUG nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 13:12:48 compute-0 nova_compute[184354]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.471 184358 INFO nova.virt.libvirt.host [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] kernel doesn't support AMD SEV
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.471 184358 DEBUG nova.compute.provider_tree [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.472 184358 DEBUG nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.474 184358 DEBUG nova.virt.libvirt.driver [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 13:12:48 compute-0 nova_compute[184354]:   <arch>x86_64</arch>
Feb 16 13:12:48 compute-0 nova_compute[184354]:   <model>Nehalem</model>
Feb 16 13:12:48 compute-0 nova_compute[184354]:   <vendor>AMD</vendor>
Feb 16 13:12:48 compute-0 nova_compute[184354]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 13:12:48 compute-0 nova_compute[184354]: </cpu>
Feb 16 13:12:48 compute-0 nova_compute[184354]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.527 184358 DEBUG nova.scheduler.client.report [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Updated inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.528 184358 DEBUG nova.compute.provider_tree [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.528 184358 DEBUG nova.compute.provider_tree [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.613 184358 DEBUG nova.compute.provider_tree [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:12:48 compute-0 sudo[185484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmghyecdwgknhwtjfrtherauosljoaps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247567.9708173-2641-269868594013215/AnsiballZ_podman_container.py'
Feb 16 13:12:48 compute-0 sudo[185484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.646 184358 DEBUG nova.compute.resource_tracker [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.647 184358 DEBUG oslo_concurrency.lockutils [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.647 184358 DEBUG nova.service [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.728 184358 DEBUG nova.service [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 13:12:48 compute-0 nova_compute[184354]: 2026-02-16 13:12:48.728 184358 DEBUG nova.servicegroup.drivers.db [None req-bfb29ca0-528a-4b66-b36a-3a224a38a045 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 13:12:48 compute-0 python3.9[185486]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 13:12:48 compute-0 sudo[185484]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:48 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:12:49 compute-0 sudo[185660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aklkcritoxxwvhhjegevvieonfzrionm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247569.1934557-2657-206912442488611/AnsiballZ_systemd.py'
Feb 16 13:12:49 compute-0 sudo[185660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:49 compute-0 python3.9[185662]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:12:49 compute-0 systemd[1]: Stopping nova_compute container...
Feb 16 13:12:50 compute-0 nova_compute[184354]: 2026-02-16 13:12:50.795 184358 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 16 13:12:50 compute-0 nova_compute[184354]: 2026-02-16 13:12:50.797 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:50 compute-0 nova_compute[184354]: 2026-02-16 13:12:50.797 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:50 compute-0 nova_compute[184354]: 2026-02-16 13:12:50.797 184358 DEBUG oslo_concurrency.lockutils [None req-cce3c730-1018-4636-adf4-dc9a711e881e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:51 compute-0 systemd[1]: libpod-c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8.scope: Deactivated successfully.
Feb 16 13:12:51 compute-0 virtqemud[184843]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 16 13:12:51 compute-0 virtqemud[184843]: hostname: compute-0
Feb 16 13:12:51 compute-0 virtqemud[184843]: End of file while reading data: Input/output error
Feb 16 13:12:51 compute-0 systemd[1]: libpod-c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8.scope: Consumed 2.887s CPU time.
Feb 16 13:12:51 compute-0 podman[185666]: 2026-02-16 13:12:51.221674317 +0000 UTC m=+1.396264936 container died c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=nova_compute, io.buildah.version=1.41.3)
Feb 16 13:12:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e-merged.mount: Deactivated successfully.
Feb 16 13:12:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8-userdata-shm.mount: Deactivated successfully.
Feb 16 13:12:51 compute-0 podman[185666]: 2026-02-16 13:12:51.275203613 +0000 UTC m=+1.449794222 container cleanup c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:12:51 compute-0 podman[185666]: nova_compute
Feb 16 13:12:51 compute-0 podman[185695]: nova_compute
Feb 16 13:12:51 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 16 13:12:51 compute-0 systemd[1]: Stopped nova_compute container.
Feb 16 13:12:51 compute-0 systemd[1]: Starting nova_compute container...
Feb 16 13:12:51 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a804bc366f293b19dde39cfde901c4b366dd2f54137430ab9bb6735d55d8502e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-0 podman[185707]: 2026-02-16 13:12:51.446986029 +0000 UTC m=+0.090440968 container init c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, io.buildah.version=1.41.3)
Feb 16 13:12:51 compute-0 podman[185707]: 2026-02-16 13:12:51.452085746 +0000 UTC m=+0.095540655 container start c4f5f852a5697c13fdc119ceb2edf67db899b1ee30325cc905694a680f74e3f8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:12:51 compute-0 podman[185707]: nova_compute
Feb 16 13:12:51 compute-0 systemd[1]: Started nova_compute container.
Feb 16 13:12:51 compute-0 nova_compute[185723]: + sudo -E kolla_set_configs
Feb 16 13:12:51 compute-0 sudo[185660]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Validating config file
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying service configuration files
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /etc/ceph
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Creating directory /etc/ceph
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Writing out command to execute
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-0 nova_compute[185723]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-0 nova_compute[185723]: ++ cat /run_command
Feb 16 13:12:51 compute-0 nova_compute[185723]: + CMD=nova-compute
Feb 16 13:12:51 compute-0 nova_compute[185723]: + ARGS=
Feb 16 13:12:51 compute-0 nova_compute[185723]: + sudo kolla_copy_cacerts
Feb 16 13:12:51 compute-0 nova_compute[185723]: + [[ ! -n '' ]]
Feb 16 13:12:51 compute-0 nova_compute[185723]: + . kolla_extend_start
Feb 16 13:12:51 compute-0 nova_compute[185723]: Running command: 'nova-compute'
Feb 16 13:12:51 compute-0 nova_compute[185723]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 13:12:51 compute-0 nova_compute[185723]: + umask 0022
Feb 16 13:12:51 compute-0 nova_compute[185723]: + exec nova-compute
Feb 16 13:12:51 compute-0 sudo[185884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxnqcfjkrseqsfcjcfsvdkrlwvkdtdqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247571.6814914-2675-185787292099637/AnsiballZ_podman_container.py'
Feb 16 13:12:51 compute-0 sudo[185884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:52 compute-0 python3.9[185886]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 13:12:52 compute-0 systemd[1]: Started libpod-conmon-71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6.scope.
Feb 16 13:12:52 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7b9eb82c7fb8a618b88f7ef2f4240f63712391167fa9b03ed13aa347fce5e6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7b9eb82c7fb8a618b88f7ef2f4240f63712391167fa9b03ed13aa347fce5e6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd7b9eb82c7fb8a618b88f7ef2f4240f63712391167fa9b03ed13aa347fce5e6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-0 podman[185912]: 2026-02-16 13:12:52.348632372 +0000 UTC m=+0.137172303 container init 71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260127, config_id=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:12:52 compute-0 podman[185912]: 2026-02-16 13:12:52.355266698 +0000 UTC m=+0.143806609 container start 71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, config_id=nova_compute_init, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init)
Feb 16 13:12:52 compute-0 python3.9[185886]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Applying nova statedir ownership
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 16 13:12:52 compute-0 nova_compute_init[185933]: INFO:nova_statedir:Nova statedir ownership complete
Feb 16 13:12:52 compute-0 systemd[1]: libpod-71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6.scope: Deactivated successfully.
Feb 16 13:12:52 compute-0 podman[185934]: 2026-02-16 13:12:52.407807779 +0000 UTC m=+0.028100192 container died 71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_id=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:12:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6-userdata-shm.mount: Deactivated successfully.
Feb 16 13:12:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-bd7b9eb82c7fb8a618b88f7ef2f4240f63712391167fa9b03ed13aa347fce5e6-merged.mount: Deactivated successfully.
Feb 16 13:12:52 compute-0 podman[185944]: 2026-02-16 13:12:52.514218804 +0000 UTC m=+0.104745015 container cleanup 71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '1fd557e901d7d9afcb043f19c585a1ce3f9fc9352d4613ac527b7499231e4d28'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init)
Feb 16 13:12:52 compute-0 systemd[1]: libpod-conmon-71fc04e292fa5c5973fc1e2ff0bb481dd4d6957cb4d5278dd3b5424f780b2be6.scope: Deactivated successfully.
Feb 16 13:12:52 compute-0 sudo[185884]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:53 compute-0 sshd-session[160853]: Connection closed by 192.168.122.30 port 48644
Feb 16 13:12:53 compute-0 sshd-session[160850]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:12:53 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 16 13:12:53 compute-0 systemd[1]: session-24.scope: Consumed 1min 27.432s CPU time.
Feb 16 13:12:53 compute-0 systemd-logind[818]: Session 24 logged out. Waiting for processes to exit.
Feb 16 13:12:53 compute-0 systemd-logind[818]: Removed session 24.
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.269 185727 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.269 185727 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.269 185727 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.270 185727 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.409 185727 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.423 185727 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.423 185727 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 13:12:53 compute-0 nova_compute[185723]: 2026-02-16 13:12:53.915 185727 INFO nova.virt.driver [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.009 185727 INFO nova.compute.provider_config [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.023 185727 DEBUG oslo_concurrency.lockutils [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.024 185727 DEBUG oslo_concurrency.lockutils [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.024 185727 DEBUG oslo_concurrency.lockutils [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.024 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.024 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.024 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.025 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.026 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.027 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.028 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.029 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.030 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.031 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.032 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.033 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.034 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.035 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.036 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.036 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.036 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.036 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.036 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.037 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.038 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.039 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.040 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.041 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.042 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.043 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.044 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.045 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.046 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.047 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.048 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.049 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.050 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.051 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.052 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.053 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.054 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.055 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.056 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.057 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.058 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.059 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.060 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.061 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.062 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.063 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.064 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.065 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.066 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.067 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.068 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.069 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.070 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.071 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.072 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.073 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.074 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.075 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.076 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.077 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.077 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.077 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.077 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.077 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.078 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.079 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.079 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.079 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.079 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.079 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.080 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.081 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.081 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.081 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.081 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.081 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.082 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.082 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.082 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.082 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.082 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.083 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.084 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.085 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.085 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.085 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.085 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.085 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.086 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.086 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.086 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.086 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.086 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.087 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.088 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.088 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.088 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.088 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.088 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.089 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.090 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.090 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.090 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.090 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.090 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.091 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.092 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.092 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.092 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.092 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.092 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.093 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.094 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.094 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.094 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.094 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.094 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.095 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.095 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.095 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.095 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.095 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.096 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.097 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.097 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.097 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.097 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.097 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.098 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.099 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.099 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.099 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.099 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.099 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.100 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.100 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.100 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.100 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.100 185727 WARNING oslo_config.cfg [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 13:12:54 compute-0 nova_compute[185723]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 13:12:54 compute-0 nova_compute[185723]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 13:12:54 compute-0 nova_compute[185723]: and ``live_migration_inbound_addr`` respectively.
Feb 16 13:12:54 compute-0 nova_compute[185723]: ).  Its value may be silently ignored in the future.
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.101 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.101 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.101 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.101 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.101 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.102 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.102 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.102 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.102 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.102 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.103 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.103 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.103 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.103 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.103 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.104 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.105 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.105 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.105 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.105 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.105 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.106 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.106 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.106 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.106 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.106 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.107 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.108 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.108 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.108 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.108 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.108 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.109 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.110 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.111 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.111 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.111 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.111 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.111 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.112 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.113 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.114 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.114 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.114 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.114 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.114 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.115 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.115 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.115 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.115 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.115 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.116 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.116 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.116 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.116 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.116 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.117 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.118 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.118 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.118 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.118 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.118 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.119 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.120 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.121 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.121 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.121 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.121 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.121 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.122 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.123 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.123 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.123 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.123 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.124 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.125 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.125 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.125 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.125 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.125 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.126 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.126 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.126 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.126 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.126 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.127 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.128 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.129 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.129 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.129 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.129 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.129 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.130 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.130 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.130 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.130 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.130 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.131 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.131 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.131 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.131 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.131 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.132 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.133 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.133 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.133 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.133 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.133 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.134 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.134 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.134 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.134 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.134 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.135 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.136 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.137 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.137 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.137 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.137 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.138 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.139 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.140 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.141 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.141 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.141 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.141 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.141 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.142 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.143 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.143 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.143 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.143 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.143 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.144 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.145 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.145 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.145 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.145 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.145 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.146 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.146 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.146 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.146 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.146 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.147 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.148 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.148 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.148 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.148 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.148 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.149 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.150 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.150 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.150 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.150 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.150 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.151 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.152 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.152 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.152 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.152 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.152 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.153 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.154 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.154 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.154 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.154 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.154 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.155 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.156 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.156 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.156 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.156 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.156 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.157 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.157 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.157 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.157 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.157 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.158 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.159 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.159 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.159 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.159 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.159 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.160 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.161 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.161 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.161 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.161 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.161 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.162 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.163 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.163 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.163 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.163 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.163 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.164 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.165 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.165 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.165 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.165 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.165 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.166 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.167 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.167 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.167 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.167 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.167 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.168 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.169 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.169 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.169 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.169 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.169 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.170 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.170 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.170 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.170 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.170 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.171 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.171 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.171 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.171 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.171 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.172 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.172 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.172 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.172 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.172 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.173 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.173 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.173 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.173 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.173 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.174 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.174 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.174 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.174 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.174 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.175 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.175 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.175 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.175 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.175 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.176 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.176 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.176 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.176 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.176 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.177 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.177 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.177 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.177 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.177 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.178 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.178 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.178 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.178 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.178 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.179 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.179 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.179 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.179 185727 DEBUG oslo_service.service [None req-291a30be-5450-4cdf-b771-76ef14c79a8c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.180 185727 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.199 185727 INFO nova.virt.node [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Determined node identity c9501a85-df32-4b8f-bce0-9425ef1e7866 from /var/lib/nova/compute_id
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.200 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.200 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.201 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.201 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.213 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2576e9c2e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.216 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2576e9c2e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.217 185727 INFO nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Connection event '1' reason 'None'
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.222 185727 INFO nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]: 
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <host>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <uuid>880a2e20-6f11-46bb-b51c-b4136280b28f</uuid>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <arch>x86_64</arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <microcode version='16777317'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <signature family='23' model='49' stepping='0'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='x2apic'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='tsc-deadline'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='osxsave'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='hypervisor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='tsc_adjust'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='spec-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='stibp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='arch-capabilities'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='cmp_legacy'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='topoext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='virt-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='lbrv'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='tsc-scale'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='vmcb-clean'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='pause-filter'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='pfthreshold'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='svme-addr-chk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='rdctl-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='mds-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature name='pschange-mc-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <pages unit='KiB' size='4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <pages unit='KiB' size='2048'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <pages unit='KiB' size='1048576'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <power_management>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <suspend_mem/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <suspend_disk/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <suspend_hybrid/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </power_management>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <iommu support='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <migration_features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <live/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <uri_transports>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <uri_transport>tcp</uri_transport>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <uri_transport>rdma</uri_transport>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </uri_transports>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </migration_features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <topology>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <cells num='1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <cell id='0'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <memory unit='KiB'>7864292</memory>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <distances>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <sibling id='0' value='10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           </distances>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           <cpus num='8'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:           </cpus>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         </cell>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </cells>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </topology>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <cache>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </cache>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <secmodel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model>selinux</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <doi>0</doi>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </secmodel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <secmodel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model>dac</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <doi>0</doi>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </secmodel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </host>
Feb 16 13:12:54 compute-0 nova_compute[185723]: 
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <guest>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <os_type>hvm</os_type>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <arch name='i686'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <wordsize>32</wordsize>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <domain type='qemu'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <domain type='kvm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <pae/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <nonpae/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <apic default='on' toggle='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <cpuselection/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <deviceboot/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <externalSnapshot/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </guest>
Feb 16 13:12:54 compute-0 nova_compute[185723]: 
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <guest>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <os_type>hvm</os_type>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <arch name='x86_64'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <wordsize>64</wordsize>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <domain type='qemu'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <domain type='kvm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <apic default='on' toggle='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <cpuselection/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <deviceboot/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <externalSnapshot/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </guest>
Feb 16 13:12:54 compute-0 nova_compute[185723]: 
Feb 16 13:12:54 compute-0 nova_compute[185723]: </capabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]: 
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.231 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.237 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 13:12:54 compute-0 nova_compute[185723]: <domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <arch>i686</arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <vcpu max='240'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <os supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <loader supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>rom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pflash</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='readonly'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>yes</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='secure'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </loader>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </os>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>anonymous</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>memfd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </memoryBacking>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <disk supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>disk</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cdrom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>floppy</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>lun</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ide</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>fdc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>sata</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vnc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </graphics>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <video supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='modelType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vga</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cirrus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>none</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>bochs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ramfb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </video>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='mode'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>subsystem</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>mandatory</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>requisite</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>optional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pci</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hostdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <rng supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>random</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='driverType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>path</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>handle</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </filesystem>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emulator</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>external</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>2.0</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </tpm>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </redirdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <channel supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </channel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </crypto>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <interface supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>passt</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <panic supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>isa</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>hyperv</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </panic>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <console supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>null</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dev</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pipe</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stdio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>udp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tcp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </console>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <gic supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sev supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='features'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>relaxed</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vapic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vpindex</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>runtime</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>synic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stimer</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reset</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>frequencies</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ipi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>avic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hyperv>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]: </domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.242 185727 DEBUG nova.virt.libvirt.volume.mount [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.248 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 13:12:54 compute-0 nova_compute[185723]: <domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <arch>i686</arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <vcpu max='4096'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <os supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <loader supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>rom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pflash</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='readonly'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>yes</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='secure'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </loader>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </os>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>anonymous</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>memfd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </memoryBacking>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <disk supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>disk</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cdrom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>floppy</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>lun</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>fdc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>sata</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vnc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </graphics>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <video supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='modelType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vga</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cirrus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>none</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>bochs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ramfb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </video>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='mode'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>subsystem</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>mandatory</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>requisite</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>optional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pci</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hostdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <rng supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>random</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='driverType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>path</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>handle</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </filesystem>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emulator</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>external</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>2.0</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </tpm>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </redirdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <channel supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </channel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </crypto>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <interface supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>passt</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <panic supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>isa</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>hyperv</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </panic>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <console supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>null</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dev</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pipe</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stdio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>udp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tcp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </console>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <gic supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sev supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='features'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>relaxed</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vapic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vpindex</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>runtime</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>synic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stimer</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reset</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>frequencies</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ipi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>avic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hyperv>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]: </domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.323 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.331 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 13:12:54 compute-0 nova_compute[185723]: <domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <arch>x86_64</arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <vcpu max='4096'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <os supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='firmware'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>efi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <loader supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>rom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pflash</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='readonly'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>yes</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='secure'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>yes</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </loader>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </os>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>anonymous</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>memfd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </memoryBacking>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <disk supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>disk</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cdrom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>floppy</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>lun</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>fdc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>sata</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vnc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </graphics>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <video supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='modelType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vga</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cirrus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>none</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>bochs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ramfb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </video>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='mode'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>subsystem</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>mandatory</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>requisite</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>optional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pci</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hostdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <rng supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>random</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='driverType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>path</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>handle</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </filesystem>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emulator</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>external</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>2.0</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </tpm>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </redirdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <channel supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </channel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </crypto>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <interface supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>passt</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <panic supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>isa</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>hyperv</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </panic>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <console supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>null</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dev</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pipe</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stdio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>udp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tcp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </console>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <gic supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sev supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='features'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>relaxed</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vapic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vpindex</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>runtime</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>synic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stimer</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reset</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>frequencies</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ipi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>avic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hyperv>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]: </domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.391 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 13:12:54 compute-0 nova_compute[185723]: <domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <arch>x86_64</arch>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <vcpu max='240'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <os supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <loader supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>rom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pflash</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='readonly'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>yes</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='secure'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>no</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </loader>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </os>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>on</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>off</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xop'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='la57'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='lam'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='hle'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='pku'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='erms'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='ss'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </blockers>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </mode>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>anonymous</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <value>memfd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </memoryBacking>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <disk supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>disk</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cdrom</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>floppy</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>lun</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ide</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>fdc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>sata</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vnc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </graphics>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <video supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='modelType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vga</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>cirrus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>none</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>bochs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ramfb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </video>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='mode'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>subsystem</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>mandatory</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>requisite</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>optional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pci</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>scsi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hostdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <rng supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>random</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>egd</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='driverType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>path</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>handle</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </filesystem>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emulator</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>external</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>2.0</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </tpm>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='bus'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>usb</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </redirdev>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <channel supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </channel>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>builtin</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </crypto>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <interface supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='backendType'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>default</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>passt</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <panic supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='model'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>isa</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>hyperv</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </panic>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <console supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='type'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>null</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vc</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pty</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dev</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>file</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>pipe</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stdio</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>udp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tcp</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>unix</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>dbus</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </console>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <features>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <gic supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sev supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <enum name='features'>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>relaxed</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vapic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vpindex</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>runtime</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>synic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>stimer</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reset</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>frequencies</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>ipi</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>avic</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </enum>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       <defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-0 nova_compute[185723]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-0 nova_compute[185723]:       </defaults>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     </hyperv>
Feb 16 13:12:54 compute-0 nova_compute[185723]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-0 nova_compute[185723]:   </features>
Feb 16 13:12:54 compute-0 nova_compute[185723]: </domainCapabilities>
Feb 16 13:12:54 compute-0 nova_compute[185723]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.468 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.468 185727 INFO nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Secure Boot support detected
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.470 185727 INFO nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.471 185727 INFO nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.478 185727 DEBUG nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 13:12:54 compute-0 nova_compute[185723]:   <model>Nehalem</model>
Feb 16 13:12:54 compute-0 nova_compute[185723]: </cpu>
Feb 16 13:12:54 compute-0 nova_compute[185723]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.480 185727 DEBUG nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.667 185727 INFO nova.virt.node [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Determined node identity c9501a85-df32-4b8f-bce0-9425ef1e7866 from /var/lib/nova/compute_id
Feb 16 13:12:54 compute-0 rsyslogd[1017]: imjournal from <np0005620856:nova_compute>: begin to drop messages due to rate-limiting
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.816 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Verified node c9501a85-df32-4b8f-bce0-9425ef1e7866 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 16 13:12:54 compute-0 nova_compute[185723]: 2026-02-16 13:12:54.902 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.057 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.057 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.057 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.058 185727 DEBUG nova.compute.resource_tracker [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.182 185727 WARNING nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.183 185727 DEBUG nova.compute.resource_tracker [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6149MB free_disk=73.43876266479492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.183 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.183 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.414 185727 DEBUG nova.compute.resource_tracker [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.414 185727 DEBUG nova.compute.resource_tracker [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.513 185727 DEBUG nova.scheduler.client.report [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.556 185727 DEBUG nova.scheduler.client.report [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.556 185727 DEBUG nova.compute.provider_tree [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.583 185727 DEBUG nova.scheduler.client.report [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.607 185727 DEBUG nova.scheduler.client.report [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.635 185727 DEBUG nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 13:12:55 compute-0 nova_compute[185723]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.635 185727 INFO nova.virt.libvirt.host [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] kernel doesn't support AMD SEV
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.636 185727 DEBUG nova.compute.provider_tree [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.636 185727 DEBUG nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.639 185727 DEBUG nova.virt.libvirt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 13:12:55 compute-0 nova_compute[185723]:   <arch>x86_64</arch>
Feb 16 13:12:55 compute-0 nova_compute[185723]:   <model>Nehalem</model>
Feb 16 13:12:55 compute-0 nova_compute[185723]:   <vendor>AMD</vendor>
Feb 16 13:12:55 compute-0 nova_compute[185723]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 13:12:55 compute-0 nova_compute[185723]: </cpu>
Feb 16 13:12:55 compute-0 nova_compute[185723]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.684 185727 DEBUG nova.scheduler.client.report [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.727 185727 DEBUG nova.compute.resource_tracker [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.728 185727 DEBUG oslo_concurrency.lockutils [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.728 185727 DEBUG nova.service [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.790 185727 DEBUG nova.service [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.790 185727 DEBUG nova.servicegroup.drivers.db [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.791 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:12:55 compute-0 nova_compute[185723]: 2026-02-16 13:12:55.819 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:12:56 compute-0 sshd-session[186022]: Connection closed by authenticating user root 64.227.72.94 port 60756 [preauth]
Feb 16 13:12:59 compute-0 sshd-session[186024]: Accepted publickey for zuul from 192.168.122.30 port 52608 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:12:59 compute-0 systemd-logind[818]: New session 26 of user zuul.
Feb 16 13:13:00 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 16 13:13:00 compute-0 sshd-session[186024]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:13:01 compute-0 python3.9[186177]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:13:02 compute-0 sudo[186331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztmsmpnzfxpzfzkignffbyvpoaawonii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247581.6508818-52-29566534470335/AnsiballZ_systemd_service.py'
Feb 16 13:13:02 compute-0 sudo[186331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:02 compute-0 python3.9[186333]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:02 compute-0 systemd[1]: Reloading.
Feb 16 13:13:02 compute-0 systemd-rc-local-generator[186357]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:02 compute-0 systemd-sysv-generator[186363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:02 compute-0 sudo[186331]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:13:03.201 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:13:03.202 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:13:03.203 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:03 compute-0 python3.9[186526]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:13:03 compute-0 network[186543]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:13:03 compute-0 network[186544]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:13:03 compute-0 network[186545]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:13:04 compute-0 sshd-session[186334]: Connection closed by authenticating user root 146.190.22.227 port 53298 [preauth]
Feb 16 13:13:06 compute-0 sudo[186816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgnqaupdymckdcddbblpzpvfzdwyderb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247586.3059773-90-98096894659685/AnsiballZ_systemd_service.py'
Feb 16 13:13:06 compute-0 sudo[186816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:06 compute-0 python3.9[186818]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:06 compute-0 sudo[186816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:07 compute-0 sudo[186980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwhidwuppkqkapqyozwsdmmwqyzwrji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247587.3340795-110-215395355261135/AnsiballZ_file.py'
Feb 16 13:13:07 compute-0 sudo[186980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:07 compute-0 podman[186943]: 2026-02-16 13:13:07.770573784 +0000 UTC m=+0.047754032 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 16 13:13:07 compute-0 python3.9[186989]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:07 compute-0 sudo[186980]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:07 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:13:07 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:13:08 compute-0 sudo[187140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzpbjqodbayyjgdjlxsmliuffhrgazpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247588.1430833-126-77730911598560/AnsiballZ_file.py'
Feb 16 13:13:08 compute-0 sudo[187140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:08 compute-0 python3.9[187142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:08 compute-0 sudo[187140]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:09 compute-0 sudo[187292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsosmypaiccrzygstmuemkebxyanbywt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247588.947715-144-161481984297292/AnsiballZ_command.py'
Feb 16 13:13:09 compute-0 sudo[187292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:09 compute-0 python3.9[187294]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:13:09 compute-0 sudo[187292]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:10 compute-0 python3.9[187446]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:13:10 compute-0 sudo[187596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xstyvpsibeplwtezuslseejcqdsanens ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247590.5432377-180-110958352860024/AnsiballZ_systemd_service.py'
Feb 16 13:13:10 compute-0 sudo[187596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:11 compute-0 python3.9[187598]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:11 compute-0 systemd[1]: Reloading.
Feb 16 13:13:11 compute-0 systemd-sysv-generator[187651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:11 compute-0 systemd-rc-local-generator[187648]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:11 compute-0 podman[187600]: 2026-02-16 13:13:11.213884509 +0000 UTC m=+0.075394002 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:13:11 compute-0 sudo[187596]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:11 compute-0 sudo[187816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnvdxwwrprwfdzdmwbjiiqsxgfwkyvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247591.599693-196-19681587263444/AnsiballZ_command.py'
Feb 16 13:13:11 compute-0 sudo[187816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:12 compute-0 python3.9[187818]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:13:12 compute-0 sudo[187816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:12 compute-0 sudo[187969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-misyenqysoqsmnyjzonmsxdchxxigjpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247592.2866817-214-20191624290061/AnsiballZ_file.py'
Feb 16 13:13:12 compute-0 sudo[187969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:12 compute-0 python3.9[187971]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:12 compute-0 sudo[187969]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:13 compute-0 python3.9[188121]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:14 compute-0 sudo[188273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shglrowazmdngrpbqmuliaqwhshewrvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247593.815226-246-21419706957476/AnsiballZ_group.py'
Feb 16 13:13:14 compute-0 sudo[188273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:14 compute-0 python3.9[188275]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 16 13:13:14 compute-0 sudo[188273]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:15 compute-0 sudo[188425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atbeqisvdeqwknzcxluqjlkgyzssuuao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247594.9249587-268-152970938384897/AnsiballZ_getent.py'
Feb 16 13:13:15 compute-0 sudo[188425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:15 compute-0 python3.9[188427]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 16 13:13:15 compute-0 sudo[188425]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:16 compute-0 sudo[188578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhervofgrrecsuplrlqcvoyqpizvjmvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247595.8825905-284-12008796657514/AnsiballZ_group.py'
Feb 16 13:13:16 compute-0 sudo[188578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:16 compute-0 python3.9[188580]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:13:16 compute-0 groupadd[188581]: group added to /etc/group: name=ceilometer, GID=42405
Feb 16 13:13:16 compute-0 groupadd[188581]: group added to /etc/gshadow: name=ceilometer
Feb 16 13:13:16 compute-0 groupadd[188581]: new group: name=ceilometer, GID=42405
Feb 16 13:13:16 compute-0 sudo[188578]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:17 compute-0 sudo[188736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcdqfvecskqqffxvgggctffnauguxtsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247596.8090155-300-109207343359485/AnsiballZ_user.py'
Feb 16 13:13:17 compute-0 sudo[188736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:17 compute-0 python3.9[188738]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:13:17 compute-0 useradd[188740]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 13:13:17 compute-0 useradd[188740]: add 'ceilometer' to group 'libvirt'
Feb 16 13:13:17 compute-0 useradd[188740]: add 'ceilometer' to shadow group 'libvirt'
Feb 16 13:13:17 compute-0 sudo[188736]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:19 compute-0 python3.9[188896]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:20 compute-0 python3.9[189017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247598.7626681-352-90689843776116/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:20 compute-0 python3.9[189167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:21 compute-0 python3.9[189288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247600.3981013-352-71063550456589/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:21 compute-0 python3.9[189438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:22 compute-0 python3.9[189559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247601.5058122-352-15276362990204/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:22 compute-0 sshd-session[189560]: Connection closed by 188.166.42.159 port 44934
Feb 16 13:13:23 compute-0 python3.9[189710]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:23 compute-0 python3.9[189862]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:24 compute-0 python3.9[190014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:25 compute-0 python3.9[190135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247604.072046-470-161582132424312/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:25 compute-0 python3.9[190285]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:26 compute-0 python3.9[190406]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247605.2167-470-105767446914737/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:26 compute-0 python3.9[190556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:27 compute-0 python3.9[190677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247606.4463196-528-276476839470222/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:28 compute-0 python3.9[190827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:28 compute-0 python3.9[190948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247607.936146-560-276224723077112/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:29 compute-0 python3.9[191098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:30 compute-0 python3.9[191219]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247609.141603-590-133916290451824/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:30 compute-0 python3.9[191369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:31 compute-0 python3.9[191490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247610.2842202-620-77785589028325/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:31 compute-0 sudo[191640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gldunbeomjejsvycjgqfgewjwgycapzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247611.361653-650-229060517533795/AnsiballZ_file.py'
Feb 16 13:13:31 compute-0 sudo[191640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:31 compute-0 python3.9[191642]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:31 compute-0 sudo[191640]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:32 compute-0 sudo[191792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbldmvozxcriznvbqghfrpclcgxeacm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247611.994958-666-129762387684762/AnsiballZ_file.py'
Feb 16 13:13:32 compute-0 sudo[191792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:32 compute-0 python3.9[191794]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:32 compute-0 sudo[191792]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:32 compute-0 python3.9[191944]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:33 compute-0 python3.9[192096]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:34 compute-0 python3.9[192248]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:34 compute-0 sudo[192400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vomshtprwyrphcmzeifnzmppbdrqlqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247614.4220622-730-220524999367551/AnsiballZ_file.py'
Feb 16 13:13:34 compute-0 sudo[192400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:34 compute-0 python3.9[192402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:34 compute-0 sudo[192400]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:35 compute-0 sudo[192552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkglkuizgyggzxcanzifyicudezqiwiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247615.0768468-746-202426608906964/AnsiballZ_systemd_service.py'
Feb 16 13:13:35 compute-0 sudo[192552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:35 compute-0 python3.9[192554]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:35 compute-0 systemd[1]: Reloading.
Feb 16 13:13:35 compute-0 systemd-sysv-generator[192583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:35 compute-0 systemd-rc-local-generator[192578]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:35 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 16 13:13:35 compute-0 sudo[192552]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:36 compute-0 sudo[192750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alhchkzrvwdbcqejpsbozraupsuzzuxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247616.1929188-764-38056140139807/AnsiballZ_stat.py'
Feb 16 13:13:36 compute-0 sudo[192750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:36 compute-0 python3.9[192752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:36 compute-0 sudo[192750]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:36 compute-0 sudo[192873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukfcgyrjmqihqghvpxqcjsietdwxicxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247616.1929188-764-38056140139807/AnsiballZ_copy.py'
Feb 16 13:13:36 compute-0 sudo[192873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:37 compute-0 python3.9[192875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247616.1929188-764-38056140139807/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:37 compute-0 sudo[192873]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:37 compute-0 sudo[193035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avcrhqmiyohhfmbybevofzsorbvcjpjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247617.7135046-806-118650652500127/AnsiballZ_file.py'
Feb 16 13:13:37 compute-0 sudo[193035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:38 compute-0 podman[192999]: 2026-02-16 13:13:38.004968328 +0000 UTC m=+0.075059965 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:13:38 compute-0 python3.9[193045]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:38 compute-0 sudo[193035]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:38 compute-0 sudo[193199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmcmghkaegotnujxjyhxpdxrpwqtoozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247618.337889-822-176796531010643/AnsiballZ_file.py'
Feb 16 13:13:38 compute-0 sudo[193199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:38 compute-0 python3.9[193201]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:38 compute-0 sudo[193199]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:39 compute-0 python3.9[193351]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:41 compute-0 sudo[193786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thcazlywfgnxeuihtmlrkeyvhovpniqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247621.062229-890-3592357587203/AnsiballZ_container_config_data.py'
Feb 16 13:13:41 compute-0 sudo[193786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:41 compute-0 podman[193746]: 2026-02-16 13:13:41.58210177 +0000 UTC m=+0.120353893 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:13:41 compute-0 python3.9[193794]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 16 13:13:41 compute-0 sudo[193786]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:42 compute-0 sudo[193951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjfqwqsiroedbtiaknumcdykxefqppgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247622.146669-912-181690134467850/AnsiballZ_container_config_hash.py'
Feb 16 13:13:42 compute-0 sudo[193951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:42 compute-0 python3.9[193953]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:13:42 compute-0 sudo[193951]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:43 compute-0 sudo[194103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dylyavdfcmcnbspvrqckctyzzsetllmu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247623.11423-932-237765682711856/AnsiballZ_edpm_container_manage.py'
Feb 16 13:13:43 compute-0 sudo[194103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:43 compute-0 python3[194105]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:13:45 compute-0 sshd-session[194140]: Connection closed by authenticating user root 146.190.226.24 port 48838 [preauth]
Feb 16 13:13:46 compute-0 podman[194118]: 2026-02-16 13:13:46.119986591 +0000 UTC m=+2.164615789 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 13:13:46 compute-0 podman[194217]: 2026-02-16 13:13:46.250366468 +0000 UTC m=+0.049349616 container create 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:13:46 compute-0 podman[194217]: 2026-02-16 13:13:46.22285279 +0000 UTC m=+0.021835958 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 13:13:46 compute-0 python3[194105]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 16 13:13:46 compute-0 sudo[194103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:46 compute-0 sudo[194403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnemnisaghxwbfegxuimmyfnwfgotrwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247626.53042-948-69071267351587/AnsiballZ_stat.py'
Feb 16 13:13:46 compute-0 sudo[194403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:46 compute-0 python3.9[194405]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:47 compute-0 sudo[194403]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:47 compute-0 sudo[194557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqknbcpyeblupekmvsaofmhcbrpcncp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247627.3145916-966-15632265951306/AnsiballZ_file.py'
Feb 16 13:13:47 compute-0 sudo[194557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:47 compute-0 python3.9[194559]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:47 compute-0 sudo[194557]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:48 compute-0 sudo[194633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdlmxpcbprwulamandvjjhaghuknaqbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247627.3145916-966-15632265951306/AnsiballZ_stat.py'
Feb 16 13:13:48 compute-0 sudo[194633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:48 compute-0 python3.9[194635]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:48 compute-0 sudo[194633]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:48 compute-0 sudo[194784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrszpdakiybnyttzepedjqvscjrlgzpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.28207-966-65722586650872/AnsiballZ_copy.py'
Feb 16 13:13:48 compute-0 sudo[194784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:48 compute-0 python3.9[194786]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247628.28207-966-65722586650872/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:48 compute-0 sudo[194784]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:49 compute-0 sudo[194860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahsvxumpammsaghcbgtmlezuibwfklqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.28207-966-65722586650872/AnsiballZ_systemd.py'
Feb 16 13:13:49 compute-0 sudo[194860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:49 compute-0 python3.9[194862]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:49 compute-0 systemd[1]: Reloading.
Feb 16 13:13:50 compute-0 systemd-sysv-generator[194891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:50 compute-0 systemd-rc-local-generator[194885]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:50 compute-0 sudo[194860]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:50 compute-0 sudo[194979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqajjcxwtwyljwrusgipvgjidsinjsoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.28207-966-65722586650872/AnsiballZ_systemd.py'
Feb 16 13:13:50 compute-0 sudo[194979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:50 compute-0 python3.9[194981]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:50 compute-0 systemd[1]: Reloading.
Feb 16 13:13:50 compute-0 systemd-rc-local-generator[195010]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:50 compute-0 systemd-sysv-generator[195014]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:51 compute-0 systemd[1]: Starting podman_exporter container...
Feb 16 13:13:51 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:13:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc281097d4654240a68e77bf63a92a4f784ee0e8a7f4ecc953cb5ca7c183bf45/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 13:13:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc281097d4654240a68e77bf63a92a4f784ee0e8a7f4ecc953cb5ca7c183bf45/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 13:13:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180.
Feb 16 13:13:51 compute-0 podman[195027]: 2026-02-16 13:13:51.190636227 +0000 UTC m=+0.136384356 container init 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.205Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.205Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.205Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.205Z caller=handler.go:105 level=info collector=container
Feb 16 13:13:51 compute-0 podman[195027]: 2026-02-16 13:13:51.213314467 +0000 UTC m=+0.159062566 container start 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:13:51 compute-0 systemd[1]: Starting Podman API Service...
Feb 16 13:13:51 compute-0 podman[195027]: podman_exporter
Feb 16 13:13:51 compute-0 systemd[1]: Started Podman API Service.
Feb 16 13:13:51 compute-0 systemd[1]: Started podman_exporter container.
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="Setting parallel job count to 25"
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="Using sqlite as database backend"
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 16 13:13:51 compute-0 podman[195053]: @ - - [16/Feb/2026:13:13:51 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 16 13:13:51 compute-0 sudo[194979]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:51 compute-0 podman[195053]: time="2026-02-16T13:13:51Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:13:51 compute-0 podman[195053]: @ - - [16/Feb/2026:13:13:51 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12585 "" "Go-http-client/1.1"
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.261Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.262Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 16 13:13:51 compute-0 podman_exporter[195042]: ts=2026-02-16T13:13:51.262Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 16 13:13:51 compute-0 podman[195051]: 2026-02-16 13:13:51.291044412 +0000 UTC m=+0.069960151 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:13:51 compute-0 systemd[1]: 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180-2f3fe8eda982c4f9.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 13:13:51 compute-0 systemd[1]: 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180-2f3fe8eda982c4f9.service: Failed with result 'exit-code'.
Feb 16 13:13:52 compute-0 python3.9[195237]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:13:52 compute-0 rsyslogd[1017]: imjournal: 1884 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 16 13:13:53 compute-0 sudo[195387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkmmepxpzwchobbezhjrtvwmvobpuxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247632.7727084-1056-187236384548581/AnsiballZ_stat.py'
Feb 16 13:13:53 compute-0 sudo[195387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:53 compute-0 python3.9[195389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:53 compute-0 sudo[195387]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.435 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.454 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.456 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.456 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.456 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.456 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.457 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.457 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.485 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.485 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.485 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.485 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:13:53 compute-0 sudo[195512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrismqwkwczthwkmocaqrxaguivatcki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247632.7727084-1056-187236384548581/AnsiballZ_copy.py'
Feb 16 13:13:53 compute-0 sudo[195512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.610 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.611 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6033MB free_disk=73.38699340820312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.612 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.612 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.688 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.689 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.709 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:13:53 compute-0 python3.9[195514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247632.7727084-1056-187236384548581/.source.yaml _original_basename=.c4mj04ta follow=False checksum=280f1141251475cb4d34033a34a96ef19e265de1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.727 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.729 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:13:53 compute-0 nova_compute[185723]: 2026-02-16 13:13:53.730 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:53 compute-0 sudo[195512]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:54 compute-0 sudo[195664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmodplvyfbholtbevwjjhcaunxiqynok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247633.912367-1086-277804771288556/AnsiballZ_stat.py'
Feb 16 13:13:54 compute-0 sudo[195664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:54 compute-0 python3.9[195666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:54 compute-0 sudo[195664]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:54 compute-0 sudo[195787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utztqkybruiuhbwxtmfeetcuydldgcno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247633.912367-1086-277804771288556/AnsiballZ_copy.py'
Feb 16 13:13:54 compute-0 sudo[195787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:54 compute-0 python3.9[195789]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247633.912367-1086-277804771288556/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:54 compute-0 sudo[195787]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:55 compute-0 sudo[195939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujhlfmzczlvaqkdooahyggnjhfrecgjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247635.5091279-1128-222839209819520/AnsiballZ_file.py'
Feb 16 13:13:55 compute-0 sudo[195939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:55 compute-0 python3.9[195941]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:55 compute-0 sudo[195939]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:56 compute-0 sudo[196091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvorffsgxooemsmoiggnpstvnuyeixzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247636.1323037-1144-6371626350523/AnsiballZ_file.py'
Feb 16 13:13:56 compute-0 sudo[196091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:56 compute-0 python3.9[196093]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:56 compute-0 sudo[196091]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:57 compute-0 python3.9[196243]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:58 compute-0 sudo[196664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqewiknlxvgmdtcigvudmxjhavzdgros ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247638.693023-1212-67561280601621/AnsiballZ_container_config_data.py'
Feb 16 13:13:58 compute-0 sudo[196664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:59 compute-0 python3.9[196666]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 16 13:13:59 compute-0 sudo[196664]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:59 compute-0 sudo[196816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pevattvxtxigcorgbsiqhpxlcumrlouj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247639.6075706-1234-128552473281295/AnsiballZ_container_config_hash.py'
Feb 16 13:13:59 compute-0 sudo[196816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:00 compute-0 python3.9[196818]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:14:00 compute-0 sudo[196816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:00 compute-0 sudo[196968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejssskfgihavwsdiatmstvembwsxvomb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247640.4153824-1254-5141204753230/AnsiballZ_edpm_container_manage.py'
Feb 16 13:14:00 compute-0 sudo[196968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:00 compute-0 python3[196970]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:14:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:14:03.202 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:14:03.204 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:14:03.204 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:03 compute-0 podman[196983]: 2026-02-16 13:14:03.22091387 +0000 UTC m=+2.254066874 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-0 podman[197082]: 2026-02-16 13:14:03.327323723 +0000 UTC m=+0.042701269 container create 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Feb 16 13:14:03 compute-0 podman[197082]: 2026-02-16 13:14:03.302776245 +0000 UTC m=+0.018153801 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-0 python3[196970]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-0 sudo[196968]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:03 compute-0 sshd-session[197057]: Connection closed by authenticating user root 64.227.72.94 port 48992 [preauth]
Feb 16 13:14:04 compute-0 sudo[197269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmxyikcgzkgdehfjxsunqovgqeeanca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247643.7963612-1270-107240524556416/AnsiballZ_stat.py'
Feb 16 13:14:04 compute-0 sudo[197269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:04 compute-0 python3.9[197271]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:04 compute-0 sudo[197269]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:04 compute-0 sudo[197423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhrymglrrhixakynbkzizteemvffyhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247644.5523458-1288-268295251470781/AnsiballZ_file.py'
Feb 16 13:14:04 compute-0 sudo[197423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:04 compute-0 python3.9[197425]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:05 compute-0 sudo[197423]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:05 compute-0 sudo[197499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjrlaertapsizepiwjhttspdgjuioesk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247644.5523458-1288-268295251470781/AnsiballZ_stat.py'
Feb 16 13:14:05 compute-0 sudo[197499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:05 compute-0 python3.9[197501]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:05 compute-0 sudo[197499]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:05 compute-0 sudo[197650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzvjtsggmwpktaajweclxwxzmwiubntp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.420818-1288-7466409388039/AnsiballZ_copy.py'
Feb 16 13:14:05 compute-0 sudo[197650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:05 compute-0 python3.9[197652]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247645.420818-1288-7466409388039/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:05 compute-0 sudo[197650]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:06 compute-0 sudo[197726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cslzwjcedmqacvtsnbkdwgvcyjmdjpse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.420818-1288-7466409388039/AnsiballZ_systemd.py'
Feb 16 13:14:06 compute-0 sudo[197726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:06 compute-0 python3.9[197728]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:14:06 compute-0 systemd[1]: Reloading.
Feb 16 13:14:06 compute-0 systemd-rc-local-generator[197752]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:14:06 compute-0 systemd-sysv-generator[197757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:14:06 compute-0 sudo[197726]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:06 compute-0 sudo[197844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iedcfopzusenktcrrrgzjrekwippwwuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.420818-1288-7466409388039/AnsiballZ_systemd.py'
Feb 16 13:14:06 compute-0 sudo[197844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:07 compute-0 python3.9[197846]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:14:07 compute-0 systemd[1]: Reloading.
Feb 16 13:14:07 compute-0 systemd-rc-local-generator[197875]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:14:07 compute-0 systemd-sysv-generator[197881]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:14:07 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 16 13:14:07 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b912c37a86b512e66231de23e402fe127579a8c7c002e8d6fe4316663a0c7c6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b912c37a86b512e66231de23e402fe127579a8c7c002e8d6fe4316663a0c7c6/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b912c37a86b512e66231de23e402fe127579a8c7c002e8d6fe4316663a0c7c6/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98.
Feb 16 13:14:07 compute-0 podman[197894]: 2026-02-16 13:14:07.75617878 +0000 UTC m=+0.194924993 container init 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *bridge.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *coverage.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *datapath.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *iface.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *memory.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *ovn.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *pmd_perf.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *pmd_rxq.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: INFO    13:14:07 main.go:48: registering *vswitch.Collector
Feb 16 13:14:07 compute-0 openstack_network_exporter[197909]: NOTICE  13:14:07 main.go:76: listening on https://:9105/metrics
Feb 16 13:14:07 compute-0 podman[197894]: 2026-02-16 13:14:07.785809594 +0000 UTC m=+0.224555797 container start 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:14:07 compute-0 podman[197894]: openstack_network_exporter
Feb 16 13:14:07 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 16 13:14:07 compute-0 sudo[197844]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:07 compute-0 podman[197919]: 2026-02-16 13:14:07.849566591 +0000 UTC m=+0.056988889 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Feb 16 13:14:08 compute-0 podman[198069]: 2026-02-16 13:14:08.749952495 +0000 UTC m=+0.041386045 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 16 13:14:08 compute-0 python3.9[198110]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:14:09 compute-0 sudo[198264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhnopqjcpjyevaqtmglmqrtnczeotbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247649.6152954-1378-162721152949301/AnsiballZ_stat.py'
Feb 16 13:14:09 compute-0 sudo[198264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:10 compute-0 python3.9[198266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:10 compute-0 sudo[198264]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:10 compute-0 sudo[198389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggawvxaftmlmwiqcueanqbfcdazaaxsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247649.6152954-1378-162721152949301/AnsiballZ_copy.py'
Feb 16 13:14:10 compute-0 sudo[198389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:10 compute-0 python3.9[198391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247649.6152954-1378-162721152949301/.source.yaml _original_basename=.19gejqdd follow=False checksum=6d20d35f7d87354f9b20be3862dc4377478a0a27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:10 compute-0 sudo[198389]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:10 compute-0 sudo[198541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahqomepreuetwiptnfpjzhzwtupmugjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247650.764697-1408-271159921898671/AnsiballZ_find.py'
Feb 16 13:14:10 compute-0 sudo[198541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:11 compute-0 python3.9[198543]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:14:11 compute-0 sudo[198541]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:12 compute-0 sudo[198713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ridkrpjfvwyfjzqvgeenfmspshxeyhnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247651.5871043-1427-207625102416866/AnsiballZ_podman_container_info.py'
Feb 16 13:14:12 compute-0 sudo[198713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:12 compute-0 podman[198646]: 2026-02-16 13:14:12.027109486 +0000 UTC m=+0.064053825 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, container_name=ovn_controller)
Feb 16 13:14:12 compute-0 python3.9[198721]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 16 13:14:12 compute-0 sudo[198713]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:13 compute-0 sudo[198884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgtykokpwuccfahkebpxesbkshbpoens ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247652.7292943-1435-229286445789049/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:13 compute-0 sudo[198884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:13 compute-0 python3.9[198886]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:13 compute-0 systemd[1]: Started libpod-conmon-19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c.scope.
Feb 16 13:14:13 compute-0 podman[198887]: 2026-02-16 13:14:13.553401318 +0000 UTC m=+0.088378018 container exec 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:14:13 compute-0 podman[198887]: 2026-02-16 13:14:13.583719049 +0000 UTC m=+0.118695729 container exec_died 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 16 13:14:13 compute-0 systemd[1]: libpod-conmon-19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c.scope: Deactivated successfully.
Feb 16 13:14:13 compute-0 sudo[198884]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:13 compute-0 sudo[199068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfemfolaubmnpedjqikkkqiakujulmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247653.7519033-1443-70106637687775/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:13 compute-0 sudo[199068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:14 compute-0 python3.9[199070]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:14 compute-0 systemd[1]: Started libpod-conmon-19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c.scope.
Feb 16 13:14:14 compute-0 podman[199071]: 2026-02-16 13:14:14.221814429 +0000 UTC m=+0.078904127 container exec 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:14:14 compute-0 podman[199071]: 2026-02-16 13:14:14.256753553 +0000 UTC m=+0.113843191 container exec_died 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:14:14 compute-0 systemd[1]: libpod-conmon-19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c.scope: Deactivated successfully.
Feb 16 13:14:14 compute-0 sudo[199068]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:14 compute-0 sudo[199252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgpcgrhuythddkyglgkvdmdjupkrcnbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247654.5060325-1451-253742962771259/AnsiballZ_file.py'
Feb 16 13:14:14 compute-0 sudo[199252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:14 compute-0 python3.9[199254]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:14 compute-0 sudo[199252]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:15 compute-0 sudo[199404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krdrcazngpznkblnanjuzytrdlzvokoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247655.1910248-1460-221074309771435/AnsiballZ_podman_container_info.py'
Feb 16 13:14:15 compute-0 sudo[199404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:15 compute-0 python3.9[199406]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 16 13:14:15 compute-0 sudo[199404]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:16 compute-0 sudo[199569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqnkqaoveettuoyzopnxcogdrlbeuia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247655.8558712-1468-187175475923403/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:16 compute-0 sudo[199569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:16 compute-0 python3.9[199571]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:16 compute-0 systemd[1]: Started libpod-conmon-d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9.scope.
Feb 16 13:14:16 compute-0 podman[199572]: 2026-02-16 13:14:16.347252891 +0000 UTC m=+0.068859892 container exec d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:14:16 compute-0 podman[199591]: 2026-02-16 13:14:16.404591406 +0000 UTC m=+0.048525383 container exec_died d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:14:16 compute-0 podman[199572]: 2026-02-16 13:14:16.410668967 +0000 UTC m=+0.132275998 container exec_died d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:14:16 compute-0 systemd[1]: libpod-conmon-d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9.scope: Deactivated successfully.
Feb 16 13:14:16 compute-0 sudo[199569]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:16 compute-0 sudo[199753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-illqyqgoorbssagolzepfherwxhgbzoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247656.609652-1476-248821714298617/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:16 compute-0 sudo[199753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:17 compute-0 python3.9[199755]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:17 compute-0 systemd[1]: Started libpod-conmon-d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9.scope.
Feb 16 13:14:17 compute-0 podman[199756]: 2026-02-16 13:14:17.096809918 +0000 UTC m=+0.068101912 container exec d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:14:17 compute-0 podman[199756]: 2026-02-16 13:14:17.131582767 +0000 UTC m=+0.102874761 container exec_died d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:14:17 compute-0 systemd[1]: libpod-conmon-d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9.scope: Deactivated successfully.
Feb 16 13:14:17 compute-0 sudo[199753]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:17 compute-0 sudo[199937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhijgxheajqbyyjmyudsdjfvjzdlbriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247657.3187728-1484-224233809283365/AnsiballZ_file.py'
Feb 16 13:14:17 compute-0 sudo[199937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:17 compute-0 python3.9[199939]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:17 compute-0 sudo[199937]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:18 compute-0 sudo[200089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzggvanuyigtigikhwrzuxbkyefygipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247657.9692428-1493-37964897189007/AnsiballZ_podman_container_info.py'
Feb 16 13:14:18 compute-0 sudo[200089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:18 compute-0 python3.9[200091]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 16 13:14:18 compute-0 sudo[200089]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:18 compute-0 sudo[200254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zijzjlgjrpoklruistialajasrasiikr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247658.5847447-1501-210728305515948/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:18 compute-0 sudo[200254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:18 compute-0 python3.9[200256]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:19 compute-0 systemd[1]: Started libpod-conmon-4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180.scope.
Feb 16 13:14:19 compute-0 podman[200257]: 2026-02-16 13:14:19.05700522 +0000 UTC m=+0.059612047 container exec 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:14:19 compute-0 podman[200276]: 2026-02-16 13:14:19.114596162 +0000 UTC m=+0.049432057 container exec_died 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:14:19 compute-0 podman[200257]: 2026-02-16 13:14:19.119512062 +0000 UTC m=+0.122118889 container exec_died 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:14:19 compute-0 systemd[1]: libpod-conmon-4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180.scope: Deactivated successfully.
Feb 16 13:14:19 compute-0 sudo[200254]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:19 compute-0 sudo[200438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymqvxfxtujkqrcnauahjmyiatkxboaij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247659.3646855-1509-244623798168142/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:19 compute-0 sudo[200438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:19 compute-0 python3.9[200440]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:19 compute-0 systemd[1]: Started libpod-conmon-4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180.scope.
Feb 16 13:14:19 compute-0 podman[200441]: 2026-02-16 13:14:19.903727086 +0000 UTC m=+0.089637861 container exec 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:14:19 compute-0 podman[200441]: 2026-02-16 13:14:19.933432321 +0000 UTC m=+0.119343066 container exec_died 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:14:19 compute-0 systemd[1]: libpod-conmon-4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180.scope: Deactivated successfully.
Feb 16 13:14:19 compute-0 sudo[200438]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:20 compute-0 sudo[200621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uttddlpljpxbfpkbemhntixyukkmukhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247660.130515-1517-270383416609924/AnsiballZ_file.py'
Feb 16 13:14:20 compute-0 sudo[200621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:20 compute-0 python3.9[200623]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:20 compute-0 sudo[200621]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:20 compute-0 sudo[200773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spzsoaxrllffwntzgtgqeyjvuhtyejmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247660.733402-1526-55745969338527/AnsiballZ_podman_container_info.py'
Feb 16 13:14:20 compute-0 sudo[200773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:21 compute-0 python3.9[200775]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 16 13:14:21 compute-0 sudo[200773]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:21 compute-0 sudo[200950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwpvipxjtuhrddimwcpeunlbajpywdtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247661.370245-1534-171198806601534/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:21 compute-0 sudo[200950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:21 compute-0 podman[200912]: 2026-02-16 13:14:21.585434537 +0000 UTC m=+0.040254045 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:14:21 compute-0 python3.9[200963]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:21 compute-0 systemd[1]: Started libpod-conmon-93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98.scope.
Feb 16 13:14:21 compute-0 podman[200964]: 2026-02-16 13:14:21.83180939 +0000 UTC m=+0.068825541 container exec 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, release=1770267347, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 13:14:21 compute-0 podman[200964]: 2026-02-16 13:14:21.861631318 +0000 UTC m=+0.098647439 container exec_died 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, 
container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:14:21 compute-0 systemd[1]: libpod-conmon-93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98.scope: Deactivated successfully.
Feb 16 13:14:21 compute-0 sudo[200950]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:22 compute-0 sudo[201146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clybzktmvksmgwyxehbfhardnewgvnqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247662.0616133-1542-247203032345247/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:22 compute-0 sudo[201146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:22 compute-0 python3.9[201148]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:22 compute-0 systemd[1]: Started libpod-conmon-93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98.scope.
Feb 16 13:14:22 compute-0 podman[201149]: 2026-02-16 13:14:22.559966201 +0000 UTC m=+0.081223078 container exec 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Feb 16 13:14:22 compute-0 podman[201149]: 2026-02-16 13:14:22.591057523 +0000 UTC m=+0.112314400 container exec_died 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1770267347, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, version=9.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Feb 16 13:14:22 compute-0 systemd[1]: libpod-conmon-93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98.scope: Deactivated successfully.
Feb 16 13:14:22 compute-0 sudo[201146]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:23 compute-0 sudo[201329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afschnhdkdnrpgpnutdjfxclhjnrzypl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247662.799632-1550-109994732455515/AnsiballZ_file.py'
Feb 16 13:14:23 compute-0 sudo[201329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:23 compute-0 python3.9[201331]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:23 compute-0 sudo[201329]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:34 compute-0 sudo[201482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waswrcqbktdgxnjmnmfcpjrqvdvxadec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247674.4857125-1692-134469031494964/AnsiballZ_file.py'
Feb 16 13:14:34 compute-0 sudo[201482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:34 compute-0 python3.9[201484]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:34 compute-0 sudo[201482]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:35 compute-0 sudo[201634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwcuqwdgnguxjeeorimgflklhqsxtvhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247675.1055634-1708-138283181785468/AnsiballZ_stat.py'
Feb 16 13:14:35 compute-0 sudo[201634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:35 compute-0 python3.9[201636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:35 compute-0 sudo[201634]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:35 compute-0 sudo[201757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wslztteuvgzcttfjpoibdcizfabsrzwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247675.1055634-1708-138283181785468/AnsiballZ_copy.py'
Feb 16 13:14:35 compute-0 sudo[201757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:35 compute-0 python3.9[201759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247675.1055634-1708-138283181785468/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:35 compute-0 sudo[201757]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:36 compute-0 sudo[201909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izmsooflnqnrmapruqlcyjswrdygzkoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.268516-1740-121879485831250/AnsiballZ_file.py'
Feb 16 13:14:36 compute-0 sudo[201909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:36 compute-0 python3.9[201911]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:36 compute-0 sudo[201909]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:37 compute-0 sudo[202061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rexdsubrwcdxecikfshpmfjzgtgtqdlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.8950603-1756-228058421802814/AnsiballZ_stat.py'
Feb 16 13:14:37 compute-0 sudo[202061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:37 compute-0 python3.9[202063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:37 compute-0 sudo[202061]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:37 compute-0 sudo[202139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvhjisvmhygomiewyeqlardecorjzmqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.8950603-1756-228058421802814/AnsiballZ_file.py'
Feb 16 13:14:37 compute-0 sudo[202139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:37 compute-0 python3.9[202141]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:37 compute-0 sudo[202139]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:37 compute-0 podman[202142]: 2026-02-16 13:14:37.949066124 +0000 UTC m=+0.057851520 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible)
Feb 16 13:14:38 compute-0 sudo[202313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxihfapuxgjcasdxzwfyxjarierghucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247678.0228634-1780-225842791350222/AnsiballZ_stat.py'
Feb 16 13:14:38 compute-0 sudo[202313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:38 compute-0 python3.9[202315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:38 compute-0 sudo[202313]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:38 compute-0 sudo[202391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adcvkwtewhozdkksydzhjeewozezrtba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247678.0228634-1780-225842791350222/AnsiballZ_file.py'
Feb 16 13:14:38 compute-0 sudo[202391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:38 compute-0 python3.9[202393]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.vk_rs4wh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:38 compute-0 sudo[202391]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:38 compute-0 podman[202394]: 2026-02-16 13:14:38.938472292 +0000 UTC m=+0.094727515 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 13:14:40 compute-0 sudo[202562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkaopvrwaerbjztuzpfqltznbdtzrgcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247679.8128924-1804-5267994958826/AnsiballZ_stat.py'
Feb 16 13:14:40 compute-0 sudo[202562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:40 compute-0 python3.9[202564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:40 compute-0 sudo[202562]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:40 compute-0 sudo[202642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgjwxjcoesjzyvkjgdnvicoivxiwwnfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247679.8128924-1804-5267994958826/AnsiballZ_file.py'
Feb 16 13:14:40 compute-0 sudo[202642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:40 compute-0 python3.9[202644]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:40 compute-0 sudo[202642]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:40 compute-0 sshd-session[202565]: Connection closed by authenticating user root 188.166.42.159 port 56962 [preauth]
Feb 16 13:14:41 compute-0 sudo[202794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnemxqfomutllouuxcfutgbsgmmfdta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247680.8667269-1830-110571063435165/AnsiballZ_command.py'
Feb 16 13:14:41 compute-0 sudo[202794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:41 compute-0 python3.9[202796]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:41 compute-0 sudo[202794]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:41 compute-0 auditd[719]: Audit daemon rotating log files
Feb 16 13:14:41 compute-0 sudo[202947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubuxsdiavfkggrtskpatycvkyqvxlwhl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247681.5008922-1846-91158880274468/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:14:41 compute-0 sudo[202947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:42 compute-0 python3[202949]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:14:42 compute-0 sudo[202947]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:42 compute-0 sudo[203112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugsosokrjtmjrrhwihmsmndcarvnsksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247682.277564-1862-131433595500595/AnsiballZ_stat.py'
Feb 16 13:14:42 compute-0 sudo[203112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:42 compute-0 podman[203073]: 2026-02-16 13:14:42.61174331 +0000 UTC m=+0.077563034 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:14:42 compute-0 python3.9[203120]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:42 compute-0 sudo[203112]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:42 compute-0 sudo[203204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmefguqbgtqbbzkaqobjhbtlnacutkoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247682.277564-1862-131433595500595/AnsiballZ_file.py'
Feb 16 13:14:42 compute-0 sudo[203204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:43 compute-0 python3.9[203206]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:43 compute-0 sudo[203204]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:43 compute-0 sudo[203356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbbyxshtjqfznzkverqustsuhgvecsdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247683.3494763-1886-39255697649802/AnsiballZ_stat.py'
Feb 16 13:14:43 compute-0 sudo[203356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:43 compute-0 python3.9[203358]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:43 compute-0 sudo[203356]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:44 compute-0 sudo[203434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtuxvmmxsvyvfjhnxofbsfxblfzlsjhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247683.3494763-1886-39255697649802/AnsiballZ_file.py'
Feb 16 13:14:44 compute-0 sudo[203434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:44 compute-0 python3.9[203436]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:44 compute-0 sudo[203434]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:44 compute-0 sudo[203586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttabauxqpkqlnjwmdgwjrqujxqkcvfis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247684.4349709-1910-35360493830604/AnsiballZ_stat.py'
Feb 16 13:14:44 compute-0 sudo[203586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:44 compute-0 python3.9[203588]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:44 compute-0 sudo[203586]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:45 compute-0 sudo[203664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oalgzhtrrhwlkpfpvsncekxwmwsfmpbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247684.4349709-1910-35360493830604/AnsiballZ_file.py'
Feb 16 13:14:45 compute-0 sudo[203664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:45 compute-0 python3.9[203666]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:45 compute-0 sudo[203664]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:45 compute-0 sudo[203816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncgorpdowlrbdypwsfwopfmqqxdzgexx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247685.4539623-1934-90536897829046/AnsiballZ_stat.py'
Feb 16 13:14:45 compute-0 sudo[203816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:45 compute-0 python3.9[203818]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:45 compute-0 sudo[203816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:46 compute-0 sudo[203894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuptbihaegihfvcgbxmcohkmqinhzown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247685.4539623-1934-90536897829046/AnsiballZ_file.py'
Feb 16 13:14:46 compute-0 sudo[203894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:46 compute-0 python3.9[203896]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:46 compute-0 sudo[203894]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:46 compute-0 sudo[204046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjnsgusczzjxsluswzjzdfznxcyvfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247686.5396595-1958-249318169093070/AnsiballZ_stat.py'
Feb 16 13:14:46 compute-0 sudo[204046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:47 compute-0 python3.9[204048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:47 compute-0 sudo[204046]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:47 compute-0 sudo[204171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxyacjmvioglvrpetniogfndefjokbux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247686.5396595-1958-249318169093070/AnsiballZ_copy.py'
Feb 16 13:14:47 compute-0 sudo[204171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:47 compute-0 python3.9[204173]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247686.5396595-1958-249318169093070/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:47 compute-0 sudo[204171]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:47 compute-0 sudo[204323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntxbtoyktdaipplijmrzlkotykkwnmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247687.6764324-1988-238693812805767/AnsiballZ_file.py'
Feb 16 13:14:47 compute-0 sudo[204323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:48 compute-0 python3.9[204325]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:48 compute-0 sudo[204323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:48 compute-0 sudo[204475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojvihbkrfpdcdpejzqnewyawzptgkrdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247688.2731671-2004-196136103225632/AnsiballZ_command.py'
Feb 16 13:14:48 compute-0 sudo[204475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:48 compute-0 python3.9[204477]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:48 compute-0 sudo[204475]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:50 compute-0 sudo[204630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlcmqthdpwvyqozybbxgtouzmwauyinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247688.9380708-2020-96148003849682/AnsiballZ_blockinfile.py'
Feb 16 13:14:50 compute-0 sudo[204630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:50 compute-0 python3.9[204632]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:50 compute-0 sudo[204630]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:50 compute-0 sudo[204782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buqfgczvqhulghuyzrpdlprmmpxxkyto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247690.558059-2038-143167745786703/AnsiballZ_command.py'
Feb 16 13:14:50 compute-0 sudo[204782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:50 compute-0 python3.9[204784]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:51 compute-0 sudo[204782]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:51 compute-0 sudo[204935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qifjcwfxofefevyjbjdckhiridldckzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247691.1643665-2054-218401646446883/AnsiballZ_stat.py'
Feb 16 13:14:51 compute-0 sudo[204935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:51 compute-0 python3.9[204937]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:51 compute-0 sudo[204935]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:51 compute-0 sudo[205100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwhqtyvaefgzfopfkugfojwkpqtdcfow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247691.7616038-2070-113119269388162/AnsiballZ_command.py'
Feb 16 13:14:51 compute-0 sudo[205100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:52 compute-0 podman[205063]: 2026-02-16 13:14:52.002992333 +0000 UTC m=+0.045040509 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:14:52 compute-0 python3.9[205113]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:52 compute-0 sudo[205100]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:52 compute-0 sudo[205266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qziqijdskxannqjfpxwpluiqacfhgrsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247692.3591816-2086-7622714190735/AnsiballZ_file.py'
Feb 16 13:14:52 compute-0 sudo[205266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:52 compute-0 python3.9[205268]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:52 compute-0 sudo[205266]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:53 compute-0 sshd-session[186027]: Connection closed by 192.168.122.30 port 52608
Feb 16 13:14:53 compute-0 sshd-session[186024]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:14:53 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 16 13:14:53 compute-0 systemd[1]: session-26.scope: Consumed 1min 4.497s CPU time.
Feb 16 13:14:53 compute-0 systemd-logind[818]: Session 26 logged out. Waiting for processes to exit.
Feb 16 13:14:53 compute-0 systemd-logind[818]: Removed session 26.
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.723 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.740 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.761 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.761 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.761 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.761 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.890 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.892 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5959MB free_disk=73.26259231567383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.892 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.892 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.990 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:14:53 compute-0 nova_compute[185723]: 2026-02-16 13:14:53.990 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.009 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.024 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.026 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.026 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.719 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.719 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.719 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.720 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.720 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.720 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-0 nova_compute[185723]: 2026-02-16 13:14:54.720 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:14:55 compute-0 nova_compute[185723]: 2026-02-16 13:14:55.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-0 nova_compute[185723]: 2026-02-16 13:14:55.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:14:55 compute-0 nova_compute[185723]: 2026-02-16 13:14:55.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:14:55 compute-0 nova_compute[185723]: 2026-02-16 13:14:55.454 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:14:55 compute-0 nova_compute[185723]: 2026-02-16 13:14:55.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-0 sshd-session[205293]: Connection closed by authenticating user root 146.190.226.24 port 32894 [preauth]
Feb 16 13:14:59 compute-0 podman[195053]: time="2026-02-16T13:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:14:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:14:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2137 "" "Go-http-client/1.1"
Feb 16 13:15:01 compute-0 openstack_network_exporter[197909]: ERROR   13:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:15:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:15:01 compute-0 openstack_network_exporter[197909]: ERROR   13:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:15:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:15:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:15:03.204 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:15:03.205 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:15:03.205 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:09 compute-0 podman[205305]: 2026-02-16 13:15:09.011192878 +0000 UTC m=+0.048934129 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:15:09 compute-0 podman[205304]: 2026-02-16 13:15:09.011456635 +0000 UTC m=+0.055359244 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z)
Feb 16 13:15:09 compute-0 sshd-session[205302]: Connection closed by authenticating user root 64.227.72.94 port 37062 [preauth]
Feb 16 13:15:13 compute-0 podman[205344]: 2026-02-16 13:15:13.026206576 +0000 UTC m=+0.068167613 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:15:23 compute-0 podman[205370]: 2026-02-16 13:15:23.000397241 +0000 UTC m=+0.046123036 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:15:29 compute-0 podman[195053]: time="2026-02-16T13:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:15:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:15:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2140 "" "Go-http-client/1.1"
Feb 16 13:15:30 compute-0 sshd-session[205395]: Connection closed by authenticating user root 146.190.22.227 port 60244 [preauth]
Feb 16 13:15:31 compute-0 openstack_network_exporter[197909]: ERROR   13:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:15:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:15:31 compute-0 openstack_network_exporter[197909]: ERROR   13:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:15:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:15:40 compute-0 podman[205399]: 2026-02-16 13:15:40.008183825 +0000 UTC m=+0.050307494 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Feb 16 13:15:40 compute-0 podman[205400]: 2026-02-16 13:15:40.027984724 +0000 UTC m=+0.067584078 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:15:40 compute-0 sshd-session[205397]: Connection closed by authenticating user root 188.166.42.159 port 44096 [preauth]
Feb 16 13:15:44 compute-0 podman[205437]: 2026-02-16 13:15:44.025782398 +0000 UTC m=+0.064356374 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.474 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.475 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.475 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.475 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.581 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.582 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6093MB free_disk=73.26263046264648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.582 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.583 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.646 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.646 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.672 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.689 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.691 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:15:53 compute-0 nova_compute[185723]: 2026-02-16 13:15:53.692 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:53 compute-0 podman[205464]: 2026-02-16 13:15:53.991786146 +0000 UTC m=+0.035996400 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:15:54 compute-0 nova_compute[185723]: 2026-02-16 13:15:54.691 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:54 compute-0 nova_compute[185723]: 2026-02-16 13:15:54.692 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:54 compute-0 nova_compute[185723]: 2026-02-16 13:15:54.692 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:15:55 compute-0 nova_compute[185723]: 2026-02-16 13:15:55.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:55 compute-0 nova_compute[185723]: 2026-02-16 13:15:55.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:56 compute-0 nova_compute[185723]: 2026-02-16 13:15:56.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:56 compute-0 nova_compute[185723]: 2026-02-16 13:15:56.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:56 compute-0 nova_compute[185723]: 2026-02-16 13:15:56.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:57 compute-0 nova_compute[185723]: 2026-02-16 13:15:57.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:57 compute-0 nova_compute[185723]: 2026-02-16 13:15:57.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:15:57 compute-0 nova_compute[185723]: 2026-02-16 13:15:57.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:15:57 compute-0 nova_compute[185723]: 2026-02-16 13:15:57.449 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:16:01 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:01.494 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:16:01 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:01.495 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:16:01 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:01.496 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:16:02 compute-0 sshd-session[205489]: Connection closed by authenticating user root 146.190.226.24 port 37896 [preauth]
Feb 16 13:16:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:03.206 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:03.207 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:16:03.207 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:11 compute-0 podman[205492]: 2026-02-16 13:16:11.004437984 +0000 UTC m=+0.045957662 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:16:11 compute-0 podman[205491]: 2026-02-16 13:16:11.03519902 +0000 UTC m=+0.079086558 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, managed_by=edpm_ansible, version=9.7, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:16:12 compute-0 sshd-session[205531]: Connection closed by authenticating user root 64.227.72.94 port 34252 [preauth]
Feb 16 13:16:15 compute-0 podman[205533]: 2026-02-16 13:16:15.025487677 +0000 UTC m=+0.069311021 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:16:25 compute-0 podman[205559]: 2026-02-16 13:16:25.017485598 +0000 UTC m=+0.055171524 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:16:29 compute-0 podman[195053]: time="2026-02-16T13:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:16:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:16:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 16 13:16:31 compute-0 openstack_network_exporter[197909]: ERROR   13:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:16:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:16:31 compute-0 openstack_network_exporter[197909]: ERROR   13:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:16:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:16:36 compute-0 sshd-session[205588]: Connection closed by authenticating user root 188.166.42.159 port 36866 [preauth]
Feb 16 13:16:42 compute-0 podman[205591]: 2026-02-16 13:16:42.028769471 +0000 UTC m=+0.053806870 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 13:16:42 compute-0 podman[205590]: 2026-02-16 13:16:42.036321982 +0000 UTC m=+0.063462404 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:16:46 compute-0 podman[205628]: 2026-02-16 13:16:46.083031201 +0000 UTC m=+0.120195505 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.436 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.463 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.464 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.590 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.591 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6159MB free_disk=73.26275253295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.591 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.591 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.711 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.712 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.737 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.819 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.820 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:16:53 compute-0 nova_compute[185723]: 2026-02-16 13:16:53.821 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:54 compute-0 nova_compute[185723]: 2026-02-16 13:16:54.819 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:55 compute-0 nova_compute[185723]: 2026-02-16 13:16:55.431 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:55 compute-0 nova_compute[185723]: 2026-02-16 13:16:55.472 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:55 compute-0 nova_compute[185723]: 2026-02-16 13:16:55.473 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-0 podman[205656]: 2026-02-16 13:16:56.030538713 +0000 UTC m=+0.073946704 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:16:56 compute-0 nova_compute[185723]: 2026-02-16 13:16:56.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-0 nova_compute[185723]: 2026-02-16 13:16:56.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-0 nova_compute[185723]: 2026-02-16 13:16:56.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-0 nova_compute[185723]: 2026-02-16 13:16:56.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:16:57 compute-0 nova_compute[185723]: 2026-02-16 13:16:57.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:57 compute-0 nova_compute[185723]: 2026-02-16 13:16:57.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:16:57 compute-0 nova_compute[185723]: 2026-02-16 13:16:57.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:16:57 compute-0 nova_compute[185723]: 2026-02-16 13:16:57.678 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:16:57 compute-0 nova_compute[185723]: 2026-02-16 13:16:57.678 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:59 compute-0 podman[195053]: time="2026-02-16T13:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:16:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:16:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2144 "" "Go-http-client/1.1"
Feb 16 13:17:01 compute-0 openstack_network_exporter[197909]: ERROR   13:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:17:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:17:01 compute-0 openstack_network_exporter[197909]: ERROR   13:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:17:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:17:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:17:03.208 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:17:03.209 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:17:03.209 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:07 compute-0 sshd-session[205681]: Connection closed by authenticating user root 146.190.226.24 port 39368 [preauth]
Feb 16 13:17:13 compute-0 podman[205683]: 2026-02-16 13:17:13.035739621 +0000 UTC m=+0.072840575 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:17:13 compute-0 podman[205684]: 2026-02-16 13:17:13.046625597 +0000 UTC m=+0.079078294 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:17:17 compute-0 podman[205726]: 2026-02-16 13:17:17.048041969 +0000 UTC m=+0.090612005 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20260127)
Feb 16 13:17:17 compute-0 sshd-session[205724]: Connection closed by authenticating user root 64.227.72.94 port 47720 [preauth]
Feb 16 13:17:27 compute-0 podman[205753]: 2026-02-16 13:17:27.005293581 +0000 UTC m=+0.045601105 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:17:29 compute-0 podman[195053]: time="2026-02-16T13:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:17:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:17:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Feb 16 13:17:31 compute-0 sshd-session[205777]: Connection closed by authenticating user root 188.166.42.159 port 41072 [preauth]
Feb 16 13:17:31 compute-0 openstack_network_exporter[197909]: ERROR   13:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:17:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:17:31 compute-0 openstack_network_exporter[197909]: ERROR   13:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:17:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:17:35 compute-0 sshd-session[205779]: Connection closed by authenticating user root 146.190.22.227 port 49662 [preauth]
Feb 16 13:17:44 compute-0 podman[205781]: 2026-02-16 13:17:44.007046358 +0000 UTC m=+0.047299418 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:17:44 compute-0 podman[205782]: 2026-02-16 13:17:44.030454801 +0000 UTC m=+0.068762872 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:17:48 compute-0 podman[205820]: 2026-02-16 13:17:48.058093189 +0000 UTC m=+0.096317520 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.454 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.455 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:17:53 compute-0 nova_compute[185723]: 2026-02-16 13:17:53.471 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.492 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.492 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.541 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.541 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.542 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.542 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.653 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.654 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6189MB free_disk=73.26248931884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.654 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.654 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.805 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.806 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:17:55 compute-0 nova_compute[185723]: 2026-02-16 13:17:55.933 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.038 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.038 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.055 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.079 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.106 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.133 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.136 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:17:56 compute-0 nova_compute[185723]: 2026-02-16 13:17:56.137 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.079 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.079 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:57 compute-0 nova_compute[185723]: 2026-02-16 13:17:57.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:58 compute-0 podman[205847]: 2026-02-16 13:17:58.025154644 +0000 UTC m=+0.069966992 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:17:59 compute-0 nova_compute[185723]: 2026-02-16 13:17:59.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:59 compute-0 nova_compute[185723]: 2026-02-16 13:17:59.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:17:59 compute-0 nova_compute[185723]: 2026-02-16 13:17:59.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:17:59 compute-0 nova_compute[185723]: 2026-02-16 13:17:59.448 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:17:59 compute-0 nova_compute[185723]: 2026-02-16 13:17:59.448 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:59 compute-0 podman[195053]: time="2026-02-16T13:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:17:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:17:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2145 "" "Go-http-client/1.1"
Feb 16 13:18:01 compute-0 openstack_network_exporter[197909]: ERROR   13:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:18:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:18:01 compute-0 openstack_network_exporter[197909]: ERROR   13:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:18:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:18:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:18:03.209 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:18:03.210 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:18:03.210 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:15 compute-0 podman[205872]: 2026-02-16 13:18:15.011763699 +0000 UTC m=+0.049101072 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 13:18:15 compute-0 podman[205871]: 2026-02-16 13:18:15.028591518 +0000 UTC m=+0.066072515 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal)
Feb 16 13:18:18 compute-0 sshd-session[205911]: Connection closed by authenticating user root 146.190.226.24 port 45250 [preauth]
Feb 16 13:18:19 compute-0 podman[205913]: 2026-02-16 13:18:19.036218581 +0000 UTC m=+0.077697454 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:18:24 compute-0 sshd-session[205941]: Connection closed by authenticating user root 64.227.72.94 port 60900 [preauth]
Feb 16 13:18:27 compute-0 sshd-session[205943]: Connection closed by authenticating user root 188.166.42.159 port 47730 [preauth]
Feb 16 13:18:29 compute-0 podman[205945]: 2026-02-16 13:18:29.001512055 +0000 UTC m=+0.045486312 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:18:29 compute-0 podman[195053]: time="2026-02-16T13:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:18:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:18:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 16 13:18:31 compute-0 openstack_network_exporter[197909]: ERROR   13:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:18:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:18:31 compute-0 openstack_network_exporter[197909]: ERROR   13:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:18:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:18:46 compute-0 podman[205971]: 2026-02-16 13:18:46.026190459 +0000 UTC m=+0.057323647 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 13:18:46 compute-0 podman[205970]: 2026-02-16 13:18:46.037169843 +0000 UTC m=+0.074908105 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7)
Feb 16 13:18:50 compute-0 podman[206011]: 2026-02-16 13:18:50.038831886 +0000 UTC m=+0.081088198 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.438 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.438 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.469 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.469 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.470 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.470 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.619 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.620 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6183MB free_disk=73.26250839233398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.620 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.621 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.713 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.714 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.739 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.757 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.758 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:18:56 compute-0 nova_compute[185723]: 2026-02-16 13:18:56.759 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:57 compute-0 nova_compute[185723]: 2026-02-16 13:18:57.750 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:57 compute-0 nova_compute[185723]: 2026-02-16 13:18:57.769 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-0 nova_compute[185723]: 2026-02-16 13:18:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-0 nova_compute[185723]: 2026-02-16 13:18:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-0 nova_compute[185723]: 2026-02-16 13:18:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-0 nova_compute[185723]: 2026-02-16 13:18:58.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:18:59 compute-0 nova_compute[185723]: 2026-02-16 13:18:59.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:59 compute-0 nova_compute[185723]: 2026-02-16 13:18:59.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:59 compute-0 podman[195053]: time="2026-02-16T13:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:18:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:18:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Feb 16 13:19:00 compute-0 podman[206037]: 2026-02-16 13:19:00.01119932 +0000 UTC m=+0.049551286 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:19:01 compute-0 openstack_network_exporter[197909]: ERROR   13:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:19:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:19:01 compute-0 openstack_network_exporter[197909]: ERROR   13:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:19:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:19:01 compute-0 nova_compute[185723]: 2026-02-16 13:19:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:01 compute-0 nova_compute[185723]: 2026-02-16 13:19:01.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:19:01 compute-0 nova_compute[185723]: 2026-02-16 13:19:01.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:19:01 compute-0 nova_compute[185723]: 2026-02-16 13:19:01.462 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:19:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:03.211 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:19:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:03.211 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:19:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:03.212 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:19:17 compute-0 podman[206063]: 2026-02-16 13:19:17.023170743 +0000 UTC m=+0.056364787 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, maintainer=Red Hat, Inc., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, architecture=x86_64, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Feb 16 13:19:17 compute-0 podman[206064]: 2026-02-16 13:19:17.03307771 +0000 UTC m=+0.066254694 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent)
Feb 16 13:19:21 compute-0 podman[206102]: 2026-02-16 13:19:21.070289743 +0000 UTC m=+0.113057391 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:19:22 compute-0 sshd-session[206131]: Connection closed by authenticating user root 188.166.42.159 port 46152 [preauth]
Feb 16 13:19:23 compute-0 sshd-session[206129]: Connection closed by authenticating user root 146.190.22.227 port 40594 [preauth]
Feb 16 13:19:28 compute-0 sshd-session[206133]: Connection closed by authenticating user root 146.190.226.24 port 60398 [preauth]
Feb 16 13:19:29 compute-0 podman[195053]: time="2026-02-16T13:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:19:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:19:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 16 13:19:31 compute-0 podman[206135]: 2026-02-16 13:19:31.015490504 +0000 UTC m=+0.057538546 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:19:31 compute-0 openstack_network_exporter[197909]: ERROR   13:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:19:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:19:31 compute-0 openstack_network_exporter[197909]: ERROR   13:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:19:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:19:37 compute-0 sshd-session[206160]: Connection closed by authenticating user root 64.227.72.94 port 36140 [preauth]
Feb 16 13:19:48 compute-0 podman[206162]: 2026-02-16 13:19:48.014386008 +0000 UTC m=+0.052953312 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 13:19:48 compute-0 podman[206163]: 2026-02-16 13:19:48.02286578 +0000 UTC m=+0.054266555 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 13:19:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:51.753 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:19:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:51.754 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:19:52 compute-0 podman[206204]: 2026-02-16 13:19:52.062181656 +0000 UTC m=+0.102260402 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:19:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:19:55.756 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:19:57 compute-0 nova_compute[185723]: 2026-02-16 13:19:57.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:57 compute-0 nova_compute[185723]: 2026-02-16 13:19:57.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.465 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.465 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.465 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.632 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.634 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6193MB free_disk=73.26206588745117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.634 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.634 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.706 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.707 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.733 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.751 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.752 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:19:58 compute-0 nova_compute[185723]: 2026-02-16 13:19:58.752 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:19:59 compute-0 podman[195053]: time="2026-02-16T13:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:19:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:19:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Feb 16 13:19:59 compute-0 nova_compute[185723]: 2026-02-16 13:19:59.752 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:59 compute-0 nova_compute[185723]: 2026-02-16 13:19:59.753 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:20:00 compute-0 nova_compute[185723]: 2026-02-16 13:20:00.430 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:01 compute-0 openstack_network_exporter[197909]: ERROR   13:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:20:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:20:01 compute-0 openstack_network_exporter[197909]: ERROR   13:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:20:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:20:01 compute-0 nova_compute[185723]: 2026-02-16 13:20:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:01 compute-0 nova_compute[185723]: 2026-02-16 13:20:01.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:02 compute-0 podman[206231]: 2026-02-16 13:20:02.009432898 +0000 UTC m=+0.047337631 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:20:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:03.211 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:03.212 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:03.212 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:03 compute-0 nova_compute[185723]: 2026-02-16 13:20:03.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:03 compute-0 nova_compute[185723]: 2026-02-16 13:20:03.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:20:03 compute-0 nova_compute[185723]: 2026-02-16 13:20:03.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:20:03 compute-0 nova_compute[185723]: 2026-02-16 13:20:03.459 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:20:19 compute-0 podman[206255]: 2026-02-16 13:20:19.012233602 +0000 UTC m=+0.048956637 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc.)
Feb 16 13:20:19 compute-0 podman[206256]: 2026-02-16 13:20:19.012781705 +0000 UTC m=+0.046950689 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:20:20 compute-0 sshd-session[206295]: Connection closed by authenticating user root 188.166.42.159 port 47862 [preauth]
Feb 16 13:20:23 compute-0 podman[206297]: 2026-02-16 13:20:23.019608318 +0000 UTC m=+0.060715834 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:20:29 compute-0 podman[195053]: time="2026-02-16T13:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:20:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:20:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.096 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.096 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.137 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.286 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.287 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.295 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.296 185727 INFO nova.compute.claims [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.634 185727 DEBUG nova.compute.provider_tree [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.650 185727 DEBUG nova.scheduler.client.report [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.677 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.678 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.731 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.732 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.764 185727 INFO nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:20:30 compute-0 nova_compute[185723]: 2026-02-16 13:20:30.793 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.027 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.029 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.029 185727 INFO nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Creating image(s)
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.030 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.030 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.030 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.031 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.031 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:31 compute-0 openstack_network_exporter[197909]: ERROR   13:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:20:31 compute-0 openstack_network_exporter[197909]: ERROR   13:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.813 185727 WARNING oslo_policy.policy [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.814 185727 WARNING oslo_policy.policy [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 13:20:31 compute-0 nova_compute[185723]: 2026-02-16 13:20:31.816 185727 DEBUG nova.policy [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:20:32 compute-0 nova_compute[185723]: 2026-02-16 13:20:32.514 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:32 compute-0 nova_compute[185723]: 2026-02-16 13:20:32.568 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:32 compute-0 nova_compute[185723]: 2026-02-16 13:20:32.569 185727 DEBUG nova.virt.images [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] 6fb9af7f-2971-4890-a777-6e99e888717f was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 16 13:20:32 compute-0 nova_compute[185723]: 2026-02-16 13:20:32.571 185727 DEBUG nova.privsep.utils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 13:20:32 compute-0 nova_compute[185723]: 2026-02-16 13:20:32.572 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.011 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.013 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 podman[206334]: 2026-02-16 13:20:33.031069676 +0000 UTC m=+0.068002547 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.052 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Successfully created port: b0642d70-aac9-4a19-b18b-6f6a914d307a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.054 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.055 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.066 185727 INFO oslo.privsep.daemon [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpu1jy6_vi/privsep.sock']
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.654 185727 INFO oslo.privsep.daemon [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Spawned new privsep daemon via rootwrap
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.546 206365 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.550 206365 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.552 206365 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.552 206365 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206365
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.727 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.770 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.771 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.771 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.782 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.821 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.822 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.843 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.844 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.844 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.883 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.884 185727 DEBUG nova.virt.disk.api [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.884 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.924 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.925 185727 DEBUG nova.virt.disk.api [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.925 185727 DEBUG nova.objects.instance [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.951 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.952 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Ensure instance console log exists: /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.952 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.952 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:33 compute-0 nova_compute[185723]: 2026-02-16 13:20:33.953 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.066 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Successfully updated port: b0642d70-aac9-4a19-b18b-6f6a914d307a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.083 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.084 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.084 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.254 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.611 185727 DEBUG nova.compute.manager [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.611 185727 DEBUG nova.compute.manager [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing instance network info cache due to event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:20:35 compute-0 nova_compute[185723]: 2026-02-16 13:20:35.612 185727 DEBUG oslo_concurrency.lockutils [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.426 185727 DEBUG nova.network.neutron [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.452 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.453 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance network_info: |[{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.453 185727 DEBUG oslo_concurrency.lockutils [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.454 185727 DEBUG nova.network.neutron [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.457 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start _get_guest_xml network_info=[{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.461 185727 WARNING nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.466 185727 DEBUG nova.virt.libvirt.host [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.467 185727 DEBUG nova.virt.libvirt.host [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.472 185727 DEBUG nova.virt.libvirt.host [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.473 185727 DEBUG nova.virt.libvirt.host [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.474 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.474 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.474 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.474 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.475 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.476 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.476 185727 DEBUG nova.virt.hardware [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.479 185727 DEBUG nova.privsep.utils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.480 185727 DEBUG nova.virt.libvirt.vif [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:20:30Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.480 185727 DEBUG nova.network.os_vif_util [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.481 185727 DEBUG nova.network.os_vif_util [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.482 185727 DEBUG nova.objects.instance [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.498 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <uuid>934dfad2-33a3-44dd-82c8-0b913e89cb8e</uuid>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <name>instance-00000001</name>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-727824786</nova:name>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:20:36</nova:creationTime>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         <nova:port uuid="b0642d70-aac9-4a19-b18b-6f6a914d307a">
Feb 16 13:20:36 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <system>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="serial">934dfad2-33a3-44dd-82c8-0b913e89cb8e</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="uuid">934dfad2-33a3-44dd-82c8-0b913e89cb8e</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </system>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <os>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </os>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <features>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </features>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:b1:7c:d9"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <target dev="tapb0642d70-aa"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/console.log" append="off"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <video>
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </video>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:20:36 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:20:36 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:20:36 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:20:36 compute-0 nova_compute[185723]: </domain>
Feb 16 13:20:36 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.499 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Preparing to wait for external event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.500 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.500 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.500 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.501 185727 DEBUG nova.virt.libvirt.vif [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-150403
8973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:20:30Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.501 185727 DEBUG nova.network.os_vif_util [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.502 185727 DEBUG nova.network.os_vif_util [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.502 185727 DEBUG os_vif [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.531 185727 DEBUG ovsdbapp.backend.ovs_idl [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.531 185727 DEBUG ovsdbapp.backend.ovs_idl [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.531 185727 DEBUG ovsdbapp.backend.ovs_idl [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.532 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.533 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.533 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.533 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.535 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.537 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.545 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.545 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.546 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.546 185727 INFO oslo.privsep.daemon [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpfp44kkav/privsep.sock']
Feb 16 13:20:36 compute-0 nova_compute[185723]: 2026-02-16 13:20:36.980 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.205 185727 INFO oslo.privsep.daemon [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Spawned new privsep daemon via rootwrap
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.089 206386 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.093 206386 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.094 206386 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.095 206386 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206386
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.586 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.586 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0642d70-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.587 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb0642d70-aa, col_values=(('external_ids', {'iface-id': 'b0642d70-aac9-4a19-b18b-6f6a914d307a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:7c:d9', 'vm-uuid': '934dfad2-33a3-44dd-82c8-0b913e89cb8e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:37 compute-0 NetworkManager[56177]: <info>  [1771248037.7781] manager: (tapb0642d70-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.778 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.781 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.783 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.783 185727 INFO os_vif [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa')
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.844 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.845 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.845 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:b1:7c:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:20:37 compute-0 nova_compute[185723]: 2026-02-16 13:20:37.845 185727 INFO nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Using config drive
Feb 16 13:20:39 compute-0 sshd-session[206390]: Connection closed by authenticating user root 146.190.226.24 port 50614 [preauth]
Feb 16 13:20:39 compute-0 nova_compute[185723]: 2026-02-16 13:20:39.818 185727 DEBUG nova.network.neutron [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated VIF entry in instance network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:20:39 compute-0 nova_compute[185723]: 2026-02-16 13:20:39.819 185727 DEBUG nova.network.neutron [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:20:39 compute-0 nova_compute[185723]: 2026-02-16 13:20:39.842 185727 DEBUG oslo_concurrency.lockutils [req-d0417391-fdab-428c-8cbe-b4ed1b20318d req-10b86116-022d-4ea9-8008-ec124c4d5c48 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.031 185727 INFO nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Creating config drive at /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.035 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpre2id2b3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.154 185727 DEBUG oslo_concurrency.processutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpre2id2b3" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:40 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 16 13:20:40 compute-0 kernel: tapb0642d70-aa: entered promiscuous mode
Feb 16 13:20:40 compute-0 NetworkManager[56177]: <info>  [1771248040.2442] manager: (tapb0642d70-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 16 13:20:40 compute-0 ovn_controller[96072]: 2026-02-16T13:20:40Z|00027|binding|INFO|Claiming lport b0642d70-aac9-4a19-b18b-6f6a914d307a for this chassis.
Feb 16 13:20:40 compute-0 ovn_controller[96072]: 2026-02-16T13:20:40Z|00028|binding|INFO|b0642d70-aac9-4a19-b18b-6f6a914d307a: Claiming fa:16:3e:b1:7c:d9 10.100.0.6
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.259 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.280 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:7c:d9 10.100.0.6'], port_security=['fa:16:3e:b1:7c:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '934dfad2-33a3-44dd-82c8-0b913e89cb8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b0642d70-aac9-4a19-b18b-6f6a914d307a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.281 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b0642d70-aac9-4a19-b18b-6f6a914d307a in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.283 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.284 105360 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpbmq7jd2e/privsep.sock']
Feb 16 13:20:40 compute-0 systemd-udevd[206414]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:20:40 compute-0 NetworkManager[56177]: <info>  [1771248040.2980] device (tapb0642d70-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:20:40 compute-0 NetworkManager[56177]: <info>  [1771248040.2991] device (tapb0642d70-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.302 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:40 compute-0 systemd-machined[155229]: New machine qemu-1-instance-00000001.
Feb 16 13:20:40 compute-0 ovn_controller[96072]: 2026-02-16T13:20:40Z|00029|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a ovn-installed in OVS
Feb 16 13:20:40 compute-0 ovn_controller[96072]: 2026-02-16T13:20:40Z|00030|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a up in Southbound
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.305 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:40 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.798 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248040.7979403, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.800 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Started (Lifecycle Event)
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.877 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.880 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248040.7981591, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.881 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Paused (Lifecycle Event)
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.907 105360 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.908 105360 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbmq7jd2e/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.790 206438 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.795 206438 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.797 206438 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.797 206438 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206438
Feb 16 13:20:40 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:40.910 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ae34ff2b-d08e-4420-8bb9-236d4979682a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.922 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.925 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:20:40 compute-0 nova_compute[185723]: 2026-02-16 13:20:40.959 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.057 185727 DEBUG nova.compute.manager [req-740c4928-c6a7-4698-94a3-e214569e988a req-64b84306-ca95-4e44-9353-0648d176d31f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.057 185727 DEBUG oslo_concurrency.lockutils [req-740c4928-c6a7-4698-94a3-e214569e988a req-64b84306-ca95-4e44-9353-0648d176d31f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.057 185727 DEBUG oslo_concurrency.lockutils [req-740c4928-c6a7-4698-94a3-e214569e988a req-64b84306-ca95-4e44-9353-0648d176d31f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.058 185727 DEBUG oslo_concurrency.lockutils [req-740c4928-c6a7-4698-94a3-e214569e988a req-64b84306-ca95-4e44-9353-0648d176d31f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.058 185727 DEBUG nova.compute.manager [req-740c4928-c6a7-4698-94a3-e214569e988a req-64b84306-ca95-4e44-9353-0648d176d31f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Processing event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.058 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.063 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.064 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248041.063506, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.064 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Resumed (Lifecycle Event)
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.072 185727 INFO nova.virt.libvirt.driver [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance spawned successfully.
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.072 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.091 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.098 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.100 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.101 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.101 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.101 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.102 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.102 185727 DEBUG nova.virt.libvirt.driver [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.127 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.213 185727 INFO nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Took 10.19 seconds to spawn the instance on the hypervisor.
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.214 185727 DEBUG nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.290 185727 INFO nova.compute.manager [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Took 11.04 seconds to build instance.
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.311 185727 DEBUG oslo_concurrency.lockutils [None req-97176d72-5811-4702-8d64-4218af710c9f 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.401 206438 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.401 206438 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.401 206438 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.950 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[60924345-50ae-43e3-b60e-36bda785a6bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.951 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6199784-11 in ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.953 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6199784-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.953 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d17e0eed-0be6-4942-9f73-4b5e8dd4a660]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.955 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c10bfe6d-a006-4f59-93fa-ce7f222af249]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.969 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[6598400c-9b82-4470-abad-06f01377f8c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.978 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8d02b7e8-86b4-4c0a-a24e-1f7b9b1dfe91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:41.980 105360 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpt3hz38v5/privsep.sock']
Feb 16 13:20:41 compute-0 nova_compute[185723]: 2026-02-16 13:20:41.982 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.557 105360 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.558 105360 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpt3hz38v5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.453 206452 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.458 206452 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.460 206452 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.461 206452 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206452
Feb 16 13:20:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:42.560 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[b7099422-7a62-44da-a014-2c3fdcf91a78]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:42 compute-0 nova_compute[185723]: 2026-02-16 13:20:42.821 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.068 206452 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.069 206452 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.069 206452 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.487 185727 DEBUG nova.compute.manager [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.487 185727 DEBUG oslo_concurrency.lockutils [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.487 185727 DEBUG oslo_concurrency.lockutils [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.488 185727 DEBUG oslo_concurrency.lockutils [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.488 185727 DEBUG nova.compute.manager [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.488 185727 WARNING nova.compute.manager [req-6029e958-5749-4910-a561-eea58535d347 req-6eb3c98b-952b-4e7f-a272-7c9d5f56e6ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state None.
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.608 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5f585d-f646-4621-9611-7d766dfd0c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.622 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[32226858-3406-42df-b615-782371996c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 NetworkManager[56177]: <info>  [1771248043.6229] manager: (tapa6199784-10): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.642 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2104a9-64a2-42b5-bcaf-5a39b61db72c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 systemd-udevd[206464]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.645 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[ee96b1c3-ece0-4b20-a777-216905544536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 NetworkManager[56177]: <info>  [1771248043.6649] device (tapa6199784-10): carrier: link connected
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.664 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfe895f-f3ad-4d69-944d-fea53253de80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.678 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[064684ac-5e8b-4a22-9a83-1c22d68b4efd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412705, 'reachable_time': 43643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206482, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.688 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2b095ea8-ce40-4914-a6fa-aee4caf05124]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:b943'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412705, 'tstamp': 412705}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206483, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.699 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[abdb999b-56f3-4972-bb4b-9f91ad88c548]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412705, 'reachable_time': 43643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206484, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.719 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8f7e3f-103a-4b40-a13b-d652da35e665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.756 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e036ed2d-ddfa-44da-82ae-eb51e591aa74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.760 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.761 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.761 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:43 compute-0 kernel: tapa6199784-10: entered promiscuous mode
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.763 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 NetworkManager[56177]: <info>  [1771248043.7658] manager: (tapa6199784-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.765 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.767 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:43 compute-0 ovn_controller[96072]: 2026-02-16T13:20:43Z|00031|binding|INFO|Releasing lport 3b5a298b-9fc2-4705-8faa-2b8cfb88937b from this chassis (sb_readonly=0)
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.767 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.768 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.769 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:20:43 compute-0 nova_compute[185723]: 2026-02-16 13:20:43.771 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.771 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f97f9657-da50-402f-b224-336c6b345c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.773 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:20:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:43.774 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'env', 'PROCESS_TAG=haproxy-a6199784-1742-41a7-9152-bb54abb7bef1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6199784-1742-41a7-9152-bb54abb7bef1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:20:44 compute-0 podman[206517]: 2026-02-16 13:20:44.109971718 +0000 UTC m=+0.062166769 container create 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:20:44 compute-0 systemd[1]: Started libpod-conmon-2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2.scope.
Feb 16 13:20:44 compute-0 podman[206517]: 2026-02-16 13:20:44.06982988 +0000 UTC m=+0.022024951 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:20:44 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:20:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd8642a8643f19c1f11b26a1d234f877e0d110687b4ce329a83b1179ce2033d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:20:44 compute-0 podman[206517]: 2026-02-16 13:20:44.212527629 +0000 UTC m=+0.164722730 container init 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:20:44 compute-0 podman[206517]: 2026-02-16 13:20:44.21766227 +0000 UTC m=+0.169857331 container start 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:20:44 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [NOTICE]   (206536) : New worker (206538) forked
Feb 16 13:20:44 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [NOTICE]   (206536) : Loading success.
Feb 16 13:20:45 compute-0 sshd-session[206547]: Connection closed by authenticating user root 64.227.72.94 port 40598 [preauth]
Feb 16 13:20:46 compute-0 nova_compute[185723]: 2026-02-16 13:20:46.984 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:47 compute-0 nova_compute[185723]: 2026-02-16 13:20:47.859 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:50 compute-0 podman[206549]: 2026-02-16 13:20:50.015146223 +0000 UTC m=+0.055754267 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z)
Feb 16 13:20:50 compute-0 podman[206550]: 2026-02-16 13:20:50.033772923 +0000 UTC m=+0.061011352 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:20:51 compute-0 nova_compute[185723]: 2026-02-16 13:20:51.986 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:52 compute-0 nova_compute[185723]: 2026-02-16 13:20:52.862 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:53 compute-0 ovn_controller[96072]: 2026-02-16T13:20:53Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:7c:d9 10.100.0.6
Feb 16 13:20:53 compute-0 ovn_controller[96072]: 2026-02-16T13:20:53Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:7c:d9 10.100.0.6
Feb 16 13:20:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:53.819 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:20:53 compute-0 nova_compute[185723]: 2026-02-16 13:20:53.819 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:20:53.820 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:20:54 compute-0 podman[206609]: 2026-02-16 13:20:54.031556634 +0000 UTC m=+0.071074939 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:20:56 compute-0 nova_compute[185723]: 2026-02-16 13:20:56.988 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:57 compute-0 nova_compute[185723]: 2026-02-16 13:20:57.896 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.486 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.486 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.486 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.486 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.569 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.622 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.623 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.679 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.805 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.806 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5663MB free_disk=73.19853591918945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.806 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.806 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.898 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 934dfad2-33a3-44dd-82c8-0b913e89cb8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.898 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.899 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:20:58 compute-0 nova_compute[185723]: 2026-02-16 13:20:58.956 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.004 185727 ERROR nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [req-7433285e-38a6-4e9e-a6ba-b0b0cafe661b] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID c9501a85-df32-4b8f-bce0-9425ef1e7866.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-7433285e-38a6-4e9e-a6ba-b0b0cafe661b"}]}
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.031 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.068 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.069 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.084 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.117 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.170 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.222 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updated inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.222 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.223 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.272 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:20:59 compute-0 nova_compute[185723]: 2026-02-16 13:20:59.272 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:59 compute-0 podman[195053]: time="2026-02-16T13:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:20:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:20:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2623 "" "Go-http-client/1.1"
Feb 16 13:21:01 compute-0 nova_compute[185723]: 2026-02-16 13:21:01.251 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:01 compute-0 nova_compute[185723]: 2026-02-16 13:21:01.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:01 compute-0 nova_compute[185723]: 2026-02-16 13:21:01.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:01 compute-0 nova_compute[185723]: 2026-02-16 13:21:01.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:21:01 compute-0 openstack_network_exporter[197909]: ERROR   13:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:21:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:21:01 compute-0 openstack_network_exporter[197909]: ERROR   13:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:21:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:21:01 compute-0 nova_compute[185723]: 2026-02-16 13:21:01.989 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:02 compute-0 nova_compute[185723]: 2026-02-16 13:21:02.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:02 compute-0 nova_compute[185723]: 2026-02-16 13:21:02.899 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:03.213 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:03.213 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:03.214 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.436 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.587 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.588 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.588 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:21:03 compute-0 nova_compute[185723]: 2026-02-16 13:21:03.588 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:03.822 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:04 compute-0 podman[206642]: 2026-02-16 13:21:04.014059049 +0000 UTC m=+0.055461630 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:21:05 compute-0 nova_compute[185723]: 2026-02-16 13:21:05.890 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:05 compute-0 nova_compute[185723]: 2026-02-16 13:21:05.924 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:05 compute-0 nova_compute[185723]: 2026-02-16 13:21:05.925 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:21:05 compute-0 nova_compute[185723]: 2026-02-16 13:21:05.925 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:06 compute-0 nova_compute[185723]: 2026-02-16 13:21:06.990 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:07 compute-0 nova_compute[185723]: 2026-02-16 13:21:07.902 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:10 compute-0 sshd-session[206668]: Connection closed by authenticating user root 146.190.22.227 port 46006 [preauth]
Feb 16 13:21:11 compute-0 nova_compute[185723]: 2026-02-16 13:21:11.991 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:12 compute-0 nova_compute[185723]: 2026-02-16 13:21:12.740 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:12 compute-0 nova_compute[185723]: 2026-02-16 13:21:12.741 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:12 compute-0 nova_compute[185723]: 2026-02-16 13:21:12.741 185727 DEBUG nova.network.neutron [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:21:12 compute-0 nova_compute[185723]: 2026-02-16 13:21:12.954 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:14 compute-0 nova_compute[185723]: 2026-02-16 13:21:14.458 185727 DEBUG nova.network.neutron [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:14 compute-0 nova_compute[185723]: 2026-02-16 13:21:14.476 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:14 compute-0 nova_compute[185723]: 2026-02-16 13:21:14.583 185727 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Feb 16 13:21:14 compute-0 nova_compute[185723]: 2026-02-16 13:21:14.584 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Creating file /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/9f18d95fd87642a4afdfcfe29f115003.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Feb 16 13:21:14 compute-0 nova_compute[185723]: 2026-02-16 13:21:14.584 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/9f18d95fd87642a4afdfcfe29f115003.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.071 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/9f18d95fd87642a4afdfcfe29f115003.tmp" returned: 1 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.072 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/9f18d95fd87642a4afdfcfe29f115003.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.072 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Creating directory /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.072 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.266 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:15 compute-0 nova_compute[185723]: 2026-02-16 13:21:15.271 185727 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Feb 16 13:21:16 compute-0 nova_compute[185723]: 2026-02-16 13:21:16.993 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 kernel: tapb0642d70-aa (unregistering): left promiscuous mode
Feb 16 13:21:17 compute-0 NetworkManager[56177]: <info>  [1771248077.5563] device (tapb0642d70-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.556 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 ovn_controller[96072]: 2026-02-16T13:21:17Z|00032|binding|INFO|Releasing lport b0642d70-aac9-4a19-b18b-6f6a914d307a from this chassis (sb_readonly=0)
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.564 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 ovn_controller[96072]: 2026-02-16T13:21:17Z|00033|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a down in Southbound
Feb 16 13:21:17 compute-0 ovn_controller[96072]: 2026-02-16T13:21:17Z|00034|binding|INFO|Removing iface tapb0642d70-aa ovn-installed in OVS
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.568 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.571 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:17.573 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:7c:d9 10.100.0.6'], port_security=['fa:16:3e:b1:7c:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '934dfad2-33a3-44dd-82c8-0b913e89cb8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b0642d70-aac9-4a19-b18b-6f6a914d307a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:21:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:17.574 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b0642d70-aac9-4a19-b18b-6f6a914d307a in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:21:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:17.575 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6199784-1742-41a7-9152-bb54abb7bef1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:21:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:17.576 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c150ab06-641f-40e6-a2e9-9de535d50704]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:17.577 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace which is not needed anymore
Feb 16 13:21:17 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 16 13:21:17 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.331s CPU time.
Feb 16 13:21:17 compute-0 systemd-machined[155229]: Machine qemu-1-instance-00000001 terminated.
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [NOTICE]   (206536) : haproxy version is 2.8.14-c23fe91
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [NOTICE]   (206536) : path to executable is /usr/sbin/haproxy
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [WARNING]  (206536) : Exiting Master process...
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [WARNING]  (206536) : Exiting Master process...
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [ALERT]    (206536) : Current worker (206538) exited with code 143 (Terminated)
Feb 16 13:21:17 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206532]: [WARNING]  (206536) : All workers exited. Exiting... (0)
Feb 16 13:21:17 compute-0 systemd[1]: libpod-2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2.scope: Deactivated successfully.
Feb 16 13:21:17 compute-0 podman[206696]: 2026-02-16 13:21:17.740607737 +0000 UTC m=+0.082492606 container died 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.816 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.882 185727 DEBUG nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.882 185727 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.883 185727 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.883 185727 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.883 185727 DEBUG nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.884 185727 WARNING nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_migrating.
Feb 16 13:21:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2-userdata-shm.mount: Deactivated successfully.
Feb 16 13:21:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-bd8642a8643f19c1f11b26a1d234f877e0d110687b4ce329a83b1179ce2033d0-merged.mount: Deactivated successfully.
Feb 16 13:21:17 compute-0 nova_compute[185723]: 2026-02-16 13:21:17.956 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 podman[206696]: 2026-02-16 13:21:18.146289673 +0000 UTC m=+0.488174552 container cleanup 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:21:18 compute-0 systemd[1]: libpod-conmon-2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2.scope: Deactivated successfully.
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.290 185727 INFO nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance shutdown successfully after 3 seconds.
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.302 185727 INFO nova.virt.libvirt.driver [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance destroyed successfully.
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.303 185727 DEBUG nova.virt.libvirt.vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:20:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:21:12Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.303 185727 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.304 185727 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.304 185727 DEBUG os_vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.306 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.306 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0642d70-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.307 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.308 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.311 185727 INFO os_vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa')
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.314 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.370 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.371 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.418 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.420 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk to 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.420 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:18 compute-0 podman[206743]: 2026-02-16 13:21:18.536484582 +0000 UTC m=+0.373355311 container remove 2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.539 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9aa55f-751e-4bbf-bf57-f9ded17b4630]: (4, ('Mon Feb 16 01:21:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2)\n2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2\nMon Feb 16 01:21:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2)\n2f073a90404d37d7251c8be552ddb516e5ee5f47decb92befaf8777edb5cddc2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.540 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[75f3a6fb-a4f8-4969-9400-bccff0000516]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.541 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.542 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 kernel: tapa6199784-10: left promiscuous mode
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.547 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.549 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8cccd29c-08fe-44b0-9ff9-e1fe7d72e84e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.564 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[721eb0e6-3332-400e-8b36-8bd13383943b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.566 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[456435af-fb35-4673-9c0a-17228952f8eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.577 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d86c078b-045b-4162-b416-4c503ce45350]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412700, 'reachable_time': 37773, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206768, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.583 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:21:18 compute-0 systemd[1]: run-netns-ovnmeta\x2da6199784\x2d1742\x2d41a7\x2d9152\x2dbb54abb7bef1.mount: Deactivated successfully.
Feb 16 13:21:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:18.584 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[82ca958a-d4d9-40b4-b5a9-9220992ba872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.947 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.948 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:21:18 compute-0 nova_compute[185723]: 2026-02-16 13:21:18.948 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.config 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.143 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.config 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.145 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.145 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.info 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:19 compute-0 sshd-session[206756]: Connection closed by authenticating user root 188.166.42.159 port 42810 [preauth]
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.326 185727 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_resize/disk.info 192.168.122.101:/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.851 185727 DEBUG neutronclient.v2_0.client [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b0642d70-aac9-4a19-b18b-6f6a914d307a for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.956 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.956 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.965 185727 INFO nova.compute.rpcapi [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.966 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.977 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.978 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:19 compute-0 nova_compute[185723]: 2026-02-16 13:21:19.978 185727 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.029 185727 DEBUG nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.030 185727 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.030 185727 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.031 185727 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.031 185727 DEBUG nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:20 compute-0 nova_compute[185723]: 2026-02-16 13:21:20.032 185727 WARNING nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_migrated.
Feb 16 13:21:21 compute-0 podman[206775]: 2026-02-16 13:21:21.024303043 +0000 UTC m=+0.052310554 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:21:21 compute-0 podman[206774]: 2026-02-16 13:21:21.043722857 +0000 UTC m=+0.078196099 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:21:21 compute-0 nova_compute[185723]: 2026-02-16 13:21:21.996 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:22 compute-0 nova_compute[185723]: 2026-02-16 13:21:22.160 185727 DEBUG nova.compute.manager [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:22 compute-0 nova_compute[185723]: 2026-02-16 13:21:22.161 185727 DEBUG nova.compute.manager [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing instance network info cache due to event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:21:22 compute-0 nova_compute[185723]: 2026-02-16 13:21:22.161 185727 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:22 compute-0 nova_compute[185723]: 2026-02-16 13:21:22.161 185727 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:22 compute-0 nova_compute[185723]: 2026-02-16 13:21:22.161 185727 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:21:23 compute-0 nova_compute[185723]: 2026-02-16 13:21:23.308 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.468 185727 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated VIF entry in instance network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.468 185727 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.490 185727 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.612 185727 DEBUG nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.613 185727 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.613 185727 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.613 185727 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.613 185727 DEBUG nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:24 compute-0 nova_compute[185723]: 2026-02-16 13:21:24.613 185727 WARNING nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_finish.
Feb 16 13:21:25 compute-0 podman[206813]: 2026-02-16 13:21:25.072572703 +0000 UTC m=+0.109387866 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.736 185727 DEBUG nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.737 185727 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.737 185727 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.737 185727 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.737 185727 DEBUG nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.737 185727 WARNING nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state resized and task_state None.
Feb 16 13:21:26 compute-0 nova_compute[185723]: 2026-02-16 13:21:26.997 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:28 compute-0 nova_compute[185723]: 2026-02-16 13:21:28.310 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.067 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.068 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.068 185727 DEBUG nova.compute.manager [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Feb 16 13:21:29 compute-0 podman[195053]: time="2026-02-16T13:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:21:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:21:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.969 185727 DEBUG neutronclient.v2_0.client [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port b0642d70-aac9-4a19-b18b-6f6a914d307a for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.970 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.970 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.970 185727 DEBUG nova.network.neutron [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:21:29 compute-0 nova_compute[185723]: 2026-02-16 13:21:29.970 185727 DEBUG nova.objects.instance [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'info_cache' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:31 compute-0 openstack_network_exporter[197909]: ERROR   13:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:21:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:21:31 compute-0 openstack_network_exporter[197909]: ERROR   13:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:21:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:31.999 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.160 185727 DEBUG nova.network.neutron [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.226 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.226 185727 DEBUG nova.objects.instance [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.252 185727 DEBUG nova.virt.libvirt.host [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.253 185727 INFO nova.virt.libvirt.host [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] UEFI support detected
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.256 185727 DEBUG nova.virt.libvirt.vif [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:21:25Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.257 185727 DEBUG nova.network.os_vif_util [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.258 185727 DEBUG nova.network.os_vif_util [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.259 185727 DEBUG os_vif [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.262 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.263 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0642d70-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.263 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.267 185727 INFO os_vif [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa')
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.268 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.268 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.399 185727 DEBUG nova.compute.provider_tree [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.418 185727 DEBUG nova.scheduler.client.report [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.473 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.687 185727 INFO nova.scheduler.client.report [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 8cb2fbbd-d3e9-4aa3-a7d8-6931faa17b05
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.771 185727 DEBUG oslo_concurrency.lockutils [None req-719fb864-75aa-44e2-b163-2f3efd0f8c69 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.835 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248077.8344762, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.836 185727 INFO nova.compute.manager [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Stopped (Lifecycle Event)
Feb 16 13:21:32 compute-0 nova_compute[185723]: 2026-02-16 13:21:32.910 185727 DEBUG nova.compute.manager [None req-4797c16e-79da-4bc8-ae12-8c4caff389b3 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:33 compute-0 nova_compute[185723]: 2026-02-16 13:21:33.312 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.423 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.424 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.450 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.549 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.550 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.559 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.559 185727 INFO nova.compute.claims [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.729 185727 DEBUG nova.compute.provider_tree [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.750 185727 DEBUG nova.scheduler.client.report [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.780 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.781 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.861 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.861 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.898 185727 INFO nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:21:34 compute-0 nova_compute[185723]: 2026-02-16 13:21:34.930 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:21:35 compute-0 podman[206839]: 2026-02-16 13:21:35.045345853 +0000 UTC m=+0.085783448 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.092 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.093 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.094 185727 INFO nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating image(s)
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.095 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.095 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.096 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.108 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.160 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.162 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.162 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.174 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.228 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.230 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.267 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.268 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.269 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.321 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.323 185727 DEBUG nova.virt.disk.api [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.324 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.401 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.403 185727 DEBUG nova.virt.disk.api [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.403 185727 DEBUG nova.objects.instance [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.430 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.431 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Ensure instance console log exists: /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.432 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.433 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.433 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:35 compute-0 nova_compute[185723]: 2026-02-16 13:21:35.854 185727 DEBUG nova.policy [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:21:37 compute-0 nova_compute[185723]: 2026-02-16 13:21:37.001 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:38 compute-0 nova_compute[185723]: 2026-02-16 13:21:38.314 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:38 compute-0 nova_compute[185723]: 2026-02-16 13:21:38.426 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Successfully created port: 3bdc1813-a8d3-43b8-805c-95acd138d9d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.565 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Successfully updated port: 3bdc1813-a8d3-43b8-805c-95acd138d9d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.593 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.594 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.594 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.719 185727 DEBUG nova.compute.manager [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-changed-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.719 185727 DEBUG nova.compute.manager [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Refreshing instance network info cache due to event network-changed-3bdc1813-a8d3-43b8-805c-95acd138d9d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.719 185727 DEBUG oslo_concurrency.lockutils [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:39 compute-0 nova_compute[185723]: 2026-02-16 13:21:39.919 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.614 185727 DEBUG nova.network.neutron [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.685 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.686 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance network_info: |[{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.686 185727 DEBUG oslo_concurrency.lockutils [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.687 185727 DEBUG nova.network.neutron [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Refreshing network info cache for port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.690 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Start _get_guest_xml network_info=[{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.696 185727 WARNING nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.711 185727 DEBUG nova.virt.libvirt.host [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.712 185727 DEBUG nova.virt.libvirt.host [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.722 185727 DEBUG nova.virt.libvirt.host [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.722 185727 DEBUG nova.virt.libvirt.host [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.724 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.724 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.724 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.725 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.725 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.725 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.725 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.725 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.726 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.726 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.726 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.726 185727 DEBUG nova.virt.hardware [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.729 185727 DEBUG nova.virt.libvirt.vif [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:21:35Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.729 185727 DEBUG nova.network.os_vif_util [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.730 185727 DEBUG nova.network.os_vif_util [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.731 185727 DEBUG nova.objects.instance [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.750 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <uuid>b21f8b55-68d7-4cd7-beed-2d61f932f84e</uuid>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <name>instance-00000003</name>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-2049385443</nova:name>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:21:41</nova:creationTime>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         <nova:port uuid="3bdc1813-a8d3-43b8-805c-95acd138d9d6">
Feb 16 13:21:41 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <system>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="serial">b21f8b55-68d7-4cd7-beed-2d61f932f84e</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="uuid">b21f8b55-68d7-4cd7-beed-2d61f932f84e</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </system>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <os>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </os>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <features>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </features>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:8a:da:08"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <target dev="tap3bdc1813-a8"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/console.log" append="off"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <video>
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </video>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:21:41 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:21:41 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:21:41 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:21:41 compute-0 nova_compute[185723]: </domain>
Feb 16 13:21:41 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.750 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Preparing to wait for external event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.751 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.751 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.751 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.752 185727 DEBUG nova.virt.libvirt.vif [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504
038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:21:35Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.752 185727 DEBUG nova.network.os_vif_util [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.753 185727 DEBUG nova.network.os_vif_util [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.753 185727 DEBUG os_vif [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.753 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.754 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.754 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.756 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.757 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdc1813-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.757 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdc1813-a8, col_values=(('external_ids', {'iface-id': '3bdc1813-a8d3-43b8-805c-95acd138d9d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:da:08', 'vm-uuid': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.759 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:41 compute-0 NetworkManager[56177]: <info>  [1771248101.7604] manager: (tap3bdc1813-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.761 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.766 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.767 185727 INFO os_vif [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8')
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.845 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.846 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.846 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:8a:da:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:21:41 compute-0 nova_compute[185723]: 2026-02-16 13:21:41.847 185727 INFO nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Using config drive
Feb 16 13:21:42 compute-0 nova_compute[185723]: 2026-02-16 13:21:42.004 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:43 compute-0 nova_compute[185723]: 2026-02-16 13:21:43.854 185727 INFO nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating config drive at /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config
Feb 16 13:21:43 compute-0 nova_compute[185723]: 2026-02-16 13:21:43.858 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptd_p7epy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:43 compute-0 nova_compute[185723]: 2026-02-16 13:21:43.977 185727 DEBUG oslo_concurrency.processutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmptd_p7epy" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:44 compute-0 kernel: tap3bdc1813-a8: entered promiscuous mode
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.0377] manager: (tap3bdc1813-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Feb 16 13:21:44 compute-0 ovn_controller[96072]: 2026-02-16T13:21:44Z|00035|binding|INFO|Claiming lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 for this chassis.
Feb 16 13:21:44 compute-0 ovn_controller[96072]: 2026-02-16T13:21:44Z|00036|binding|INFO|3bdc1813-a8d3-43b8-805c-95acd138d9d6: Claiming fa:16:3e:8a:da:08 10.100.0.4
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.052 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:44 compute-0 ovn_controller[96072]: 2026-02-16T13:21:44Z|00037|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 ovn-installed in OVS
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.057 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:44 compute-0 systemd-machined[155229]: New machine qemu-2-instance-00000003.
Feb 16 13:21:44 compute-0 ovn_controller[96072]: 2026-02-16T13:21:44Z|00038|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 up in Southbound
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.085 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:da:08 10.100.0.4'], port_security=['fa:16:3e:8a:da:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=3bdc1813-a8d3-43b8-805c-95acd138d9d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.086 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.087 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:44 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.097 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c78af7cd-b894-497d-b70a-7d2aa52f9bc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.098 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6199784-11 in ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.099 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6199784-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.099 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1100a20f-e5fb-4a1e-8de7-de1f772a5ba9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.100 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae29a4a-3639-4cf8-aa86-4a320358522a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 systemd-udevd[206902]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.109 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[4812c405-1683-4539-af03-40771fd75f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.1179] device (tap3bdc1813-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.1184] device (tap3bdc1813-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.124 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[317975f1-3348-4b27-8d0d-bfbe52bff0d0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.150 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[08ecb737-421b-44b9-82a0-33d3205d8253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.156 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2d563f5e-5967-4453-9bab-7c32ae16cd4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.1579] manager: (tapa6199784-10): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.185 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[72cc46a5-38a0-4f73-b2b3-99c828f8653f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.189 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0fddd59f-a9d9-42ae-90a6-041e3302f7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.2076] device (tapa6199784-10): carrier: link connected
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.209 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[23ab1e9a-003f-4ce4-8dbf-05e40bea7e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.223 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5b0022-1107-49fe-8f2b-a6008b7746eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 19845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206934, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.233 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[47a72486-4e20-424c-bbe5-c6c7d30f3f55]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:b943'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418760, 'tstamp': 418760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206935, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.243 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d7eb36cf-6652-4f1b-91d5-c4bcb47f72fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 19845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206936, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.271 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb28d2c-a5d9-4fb0-b2e5-625cd8416950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.322 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[91c26e35-a6f0-477c-b4a3-6c2caeaefcbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.325 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.325 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.326 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.328 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:44 compute-0 NetworkManager[56177]: <info>  [1771248104.3289] manager: (tapa6199784-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 16 13:21:44 compute-0 kernel: tapa6199784-10: entered promiscuous mode
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.333 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.334 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:44 compute-0 ovn_controller[96072]: 2026-02-16T13:21:44Z|00039|binding|INFO|Releasing lport 3b5a298b-9fc2-4705-8faa-2b8cfb88937b from this chassis (sb_readonly=0)
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.335 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.336 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[49d501b6-cef1-43ab-8a17-975e3f2a19b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.337 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:21:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:21:44.338 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'env', 'PROCESS_TAG=haproxy-a6199784-1742-41a7-9152-bb54abb7bef1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6199784-1742-41a7-9152-bb54abb7bef1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.338 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.423 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248104.422254, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.423 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Started (Lifecycle Event)
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.479 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.484 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248104.4224453, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.485 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Paused (Lifecycle Event)
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.513 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.518 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:44 compute-0 nova_compute[185723]: 2026-02-16 13:21:44.548 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:21:44 compute-0 podman[206977]: 2026-02-16 13:21:44.646301871 +0000 UTC m=+0.042178911 container create ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:21:44 compute-0 systemd[1]: Started libpod-conmon-ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf.scope.
Feb 16 13:21:44 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:21:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/890aef6c8cdf419a2b6c513b894d36adca80637c7d3be7cad014923b430b09ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:21:44 compute-0 podman[206977]: 2026-02-16 13:21:44.624061708 +0000 UTC m=+0.019938778 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:21:44 compute-0 podman[206977]: 2026-02-16 13:21:44.730291034 +0000 UTC m=+0.126168104 container init ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:21:44 compute-0 podman[206977]: 2026-02-16 13:21:44.737389811 +0000 UTC m=+0.133266851 container start ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:21:44 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [NOTICE]   (206996) : New worker (206998) forked
Feb 16 13:21:44 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [NOTICE]   (206996) : Loading success.
Feb 16 13:21:45 compute-0 sshd-session[206953]: Connection closed by authenticating user root 146.190.226.24 port 33686 [preauth]
Feb 16 13:21:46 compute-0 nova_compute[185723]: 2026-02-16 13:21:46.761 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.006 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.381 185727 DEBUG nova.network.neutron [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updated VIF entry in instance network info cache for port 3bdc1813-a8d3-43b8-805c-95acd138d9d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.381 185727 DEBUG nova.network.neutron [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.404 185727 DEBUG nova.compute.manager [req-e7314660-7009-4fed-8594-45042f663136 req-d583da35-f4fe-4b59-835d-59a270c8ae0d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.405 185727 DEBUG oslo_concurrency.lockutils [req-e7314660-7009-4fed-8594-45042f663136 req-d583da35-f4fe-4b59-835d-59a270c8ae0d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.405 185727 DEBUG oslo_concurrency.lockutils [req-e7314660-7009-4fed-8594-45042f663136 req-d583da35-f4fe-4b59-835d-59a270c8ae0d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.406 185727 DEBUG oslo_concurrency.lockutils [req-e7314660-7009-4fed-8594-45042f663136 req-d583da35-f4fe-4b59-835d-59a270c8ae0d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.407 185727 DEBUG nova.compute.manager [req-e7314660-7009-4fed-8594-45042f663136 req-d583da35-f4fe-4b59-835d-59a270c8ae0d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Processing event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.408 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.413 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248107.412979, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.413 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Resumed (Lifecycle Event)
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.415 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.419 185727 INFO nova.virt.libvirt.driver [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance spawned successfully.
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.419 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.421 185727 DEBUG oslo_concurrency.lockutils [req-e8db086c-27ac-4de5-88e8-69cd2a288832 req-5be8caaa-3c53-4e5c-82dc-490fdd00dee8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.457 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.463 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.466 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.466 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.467 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.467 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.467 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.468 185727 DEBUG nova.virt.libvirt.driver [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.509 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.608 185727 INFO nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Took 12.52 seconds to spawn the instance on the hypervisor.
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.609 185727 DEBUG nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.714 185727 INFO nova.compute.manager [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Took 13.19 seconds to build instance.
Feb 16 13:21:47 compute-0 nova_compute[185723]: 2026-02-16 13:21:47.749 185727 DEBUG oslo_concurrency.lockutils [None req-0397efa7-33b2-4726-9e67-1a7c9d6a31a1 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:49 compute-0 sshd-session[207007]: Connection closed by authenticating user root 64.227.72.94 port 48922 [preauth]
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.610 185727 DEBUG nova.compute.manager [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.611 185727 DEBUG oslo_concurrency.lockutils [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.611 185727 DEBUG oslo_concurrency.lockutils [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.611 185727 DEBUG oslo_concurrency.lockutils [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.611 185727 DEBUG nova.compute.manager [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:49 compute-0 nova_compute[185723]: 2026-02-16 13:21:49.611 185727 WARNING nova.compute.manager [req-9c8e3beb-31d5-4315-bbb1-94a97dbbb553 req-a3ef0720-a87b-46fd-9c06-53fcfd639cf4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state None.
Feb 16 13:21:51 compute-0 nova_compute[185723]: 2026-02-16 13:21:51.764 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:52 compute-0 nova_compute[185723]: 2026-02-16 13:21:52.007 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:52 compute-0 podman[207011]: 2026-02-16 13:21:52.017031534 +0000 UTC m=+0.047559945 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 16 13:21:52 compute-0 podman[207010]: 2026-02-16 13:21:52.021205378 +0000 UTC m=+0.052148460 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1770267347, vendor=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:21:56 compute-0 podman[207050]: 2026-02-16 13:21:56.046259372 +0000 UTC m=+0.081944343 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:21:56 compute-0 nova_compute[185723]: 2026-02-16 13:21:56.767 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:57 compute-0 nova_compute[185723]: 2026-02-16 13:21:57.035 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:58 compute-0 nova_compute[185723]: 2026-02-16 13:21:58.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:59 compute-0 nova_compute[185723]: 2026-02-16 13:21:59.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:59 compute-0 ovn_controller[96072]: 2026-02-16T13:21:59Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:da:08 10.100.0.4
Feb 16 13:21:59 compute-0 ovn_controller[96072]: 2026-02-16T13:21:59Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:da:08 10.100.0.4
Feb 16 13:21:59 compute-0 podman[195053]: time="2026-02-16T13:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:21:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:21:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2628 "" "Go-http-client/1.1"
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.470 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.470 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.471 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.471 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.549 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.608 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.609 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.653 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.776 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.778 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5667MB free_disk=73.19858932495117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.778 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:00 compute-0 nova_compute[185723]: 2026-02-16 13:22:00.778 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.001 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance b21f8b55-68d7-4cd7-beed-2d61f932f84e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.002 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.002 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.080 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.112 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.150 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.151 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:01 compute-0 openstack_network_exporter[197909]: ERROR   13:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:22:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:22:01 compute-0 openstack_network_exporter[197909]: ERROR   13:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:22:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:22:01 compute-0 nova_compute[185723]: 2026-02-16 13:22:01.770 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:02 compute-0 nova_compute[185723]: 2026-02-16 13:22:02.036 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:02 compute-0 nova_compute[185723]: 2026-02-16 13:22:02.642 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:02 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:02.643 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:02 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:02.644 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:22:03 compute-0 nova_compute[185723]: 2026-02-16 13:22:03.151 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:03.213 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:03.214 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:03.214 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:03 compute-0 nova_compute[185723]: 2026-02-16 13:22:03.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:03 compute-0 nova_compute[185723]: 2026-02-16 13:22:03.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:03 compute-0 nova_compute[185723]: 2026-02-16 13:22:03.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:22:03 compute-0 nova_compute[185723]: 2026-02-16 13:22:03.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:22:04 compute-0 nova_compute[185723]: 2026-02-16 13:22:04.610 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:04 compute-0 nova_compute[185723]: 2026-02-16 13:22:04.611 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:04 compute-0 nova_compute[185723]: 2026-02-16 13:22:04.611 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:22:04 compute-0 nova_compute[185723]: 2026-02-16 13:22:04.611 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:06 compute-0 podman[207098]: 2026-02-16 13:22:06.033217204 +0000 UTC m=+0.070667512 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:22:06 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:06.646 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:06 compute-0 nova_compute[185723]: 2026-02-16 13:22:06.772 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:07 compute-0 nova_compute[185723]: 2026-02-16 13:22:07.037 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:11 compute-0 nova_compute[185723]: 2026-02-16 13:22:11.775 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:12 compute-0 nova_compute[185723]: 2026-02-16 13:22:12.039 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:13 compute-0 nova_compute[185723]: 2026-02-16 13:22:13.458 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:14 compute-0 nova_compute[185723]: 2026-02-16 13:22:14.852 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:14 compute-0 nova_compute[185723]: 2026-02-16 13:22:14.852 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:22:14 compute-0 nova_compute[185723]: 2026-02-16 13:22:14.853 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:14 compute-0 nova_compute[185723]: 2026-02-16 13:22:14.853 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:14 compute-0 nova_compute[185723]: 2026-02-16 13:22:14.853 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:22:15 compute-0 sshd-session[207124]: Connection closed by authenticating user root 188.166.42.159 port 37784 [preauth]
Feb 16 13:22:16 compute-0 nova_compute[185723]: 2026-02-16 13:22:16.824 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:17 compute-0 nova_compute[185723]: 2026-02-16 13:22:17.042 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:21 compute-0 nova_compute[185723]: 2026-02-16 13:22:21.826 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:22 compute-0 nova_compute[185723]: 2026-02-16 13:22:22.042 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:23 compute-0 podman[207127]: 2026-02-16 13:22:23.01799719 +0000 UTC m=+0.051540795 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb 16 13:22:23 compute-0 podman[207126]: 2026-02-16 13:22:23.048149671 +0000 UTC m=+0.081927962 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc.)
Feb 16 13:22:26 compute-0 nova_compute[185723]: 2026-02-16 13:22:26.830 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:27 compute-0 podman[207167]: 2026-02-16 13:22:27.028065939 +0000 UTC m=+0.069914382 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 16 13:22:27 compute-0 nova_compute[185723]: 2026-02-16 13:22:27.083 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:29 compute-0 podman[195053]: time="2026-02-16T13:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:22:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:22:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2628 "" "Go-http-client/1.1"
Feb 16 13:22:31 compute-0 openstack_network_exporter[197909]: ERROR   13:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:22:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:22:31 compute-0 openstack_network_exporter[197909]: ERROR   13:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:22:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:22:31 compute-0 nova_compute[185723]: 2026-02-16 13:22:31.831 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:32 compute-0 nova_compute[185723]: 2026-02-16 13:22:32.087 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:32 compute-0 ovn_controller[96072]: 2026-02-16T13:22:32Z|00040|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 13:22:36 compute-0 nova_compute[185723]: 2026-02-16 13:22:36.834 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:37 compute-0 podman[207195]: 2026-02-16 13:22:37.02018412 +0000 UTC m=+0.059863102 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:22:37 compute-0 nova_compute[185723]: 2026-02-16 13:22:37.088 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:41 compute-0 sshd-session[207219]: Connection closed by authenticating user root 64.227.72.94 port 60012 [preauth]
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.765 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.765 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.809 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.837 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.945 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.945 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.955 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:22:41 compute-0 nova_compute[185723]: 2026-02-16 13:22:41.955 185727 INFO nova.compute.claims [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.092 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.158 185727 DEBUG nova.compute.provider_tree [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.183 185727 DEBUG nova.scheduler.client.report [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.225 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.226 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.284 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.284 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.337 185727 INFO nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.407 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.576 185727 DEBUG nova.policy [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.611 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.612 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.612 185727 INFO nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Creating image(s)
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.613 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.613 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.613 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.626 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.670 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.672 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.673 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.691 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.734 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.735 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.834 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk 1073741824" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.835 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.836 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.881 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.881 185727 DEBUG nova.virt.disk.api [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.882 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.924 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.925 185727 DEBUG nova.virt.disk.api [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.925 185727 DEBUG nova.objects.instance [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid 139d8f81-7f89-4100-af32-e59289aeb6f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.950 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.950 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Ensure instance console log exists: /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.951 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.951 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:42 compute-0 nova_compute[185723]: 2026-02-16 13:22:42.951 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:43.711 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:43.711 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:22:43 compute-0 nova_compute[185723]: 2026-02-16 13:22:43.763 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:44 compute-0 nova_compute[185723]: 2026-02-16 13:22:44.124 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Successfully created port: b14d49f7-53e0-4c41-a463-8f16b26817ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:22:46 compute-0 nova_compute[185723]: 2026-02-16 13:22:46.840 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.133 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.296 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Successfully updated port: b14d49f7-53e0-4c41-a463-8f16b26817ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.371 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.372 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.372 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.424 185727 DEBUG nova.compute.manager [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-changed-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.425 185727 DEBUG nova.compute.manager [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Refreshing instance network info cache due to event network-changed-b14d49f7-53e0-4c41-a463-8f16b26817ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:22:47 compute-0 nova_compute[185723]: 2026-02-16 13:22:47.425 185727 DEBUG oslo_concurrency.lockutils [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:48 compute-0 nova_compute[185723]: 2026-02-16 13:22:48.107 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:22:48 compute-0 sshd-session[207243]: Connection closed by authenticating user root 146.190.22.227 port 56294 [preauth]
Feb 16 13:22:48 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:48.714 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.551 185727 DEBUG nova.network.neutron [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updating instance_info_cache with network_info: [{"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.578 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.578 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Instance network_info: |[{"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.579 185727 DEBUG oslo_concurrency.lockutils [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.579 185727 DEBUG nova.network.neutron [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Refreshing network info cache for port b14d49f7-53e0-4c41-a463-8f16b26817ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.582 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Start _get_guest_xml network_info=[{"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.587 185727 WARNING nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.594 185727 DEBUG nova.virt.libvirt.host [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.595 185727 DEBUG nova.virt.libvirt.host [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.597 185727 DEBUG nova.virt.libvirt.host [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.598 185727 DEBUG nova.virt.libvirt.host [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.600 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.600 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.601 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.602 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.602 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.602 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.603 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.603 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.603 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.604 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.604 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.604 185727 DEBUG nova.virt.hardware [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.608 185727 DEBUG nova.virt.libvirt.vif [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:22:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1427510884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1427510884',id=5,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-gqinun03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:22:42Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=139d8f81-7f89-4100-af32-e59289aeb6f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.609 185727 DEBUG nova.network.os_vif_util [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.610 185727 DEBUG nova.network.os_vif_util [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.611 185727 DEBUG nova.objects.instance [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 139d8f81-7f89-4100-af32-e59289aeb6f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.629 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <uuid>139d8f81-7f89-4100-af32-e59289aeb6f5</uuid>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <name>instance-00000005</name>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1427510884</nova:name>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:22:49</nova:creationTime>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         <nova:port uuid="b14d49f7-53e0-4c41-a463-8f16b26817ae">
Feb 16 13:22:49 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <system>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="serial">139d8f81-7f89-4100-af32-e59289aeb6f5</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="uuid">139d8f81-7f89-4100-af32-e59289aeb6f5</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </system>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <os>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </os>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <features>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </features>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.config"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:49:6b:ae"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <target dev="tapb14d49f7-53"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/console.log" append="off"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <video>
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </video>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:22:49 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:22:49 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:22:49 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:22:49 compute-0 nova_compute[185723]: </domain>
Feb 16 13:22:49 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.631 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Preparing to wait for external event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.631 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.631 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.632 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.632 185727 DEBUG nova.virt.libvirt.vif [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:22:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1427510884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1427510884',id=5,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-gqinun03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504
038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:22:42Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=139d8f81-7f89-4100-af32-e59289aeb6f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.633 185727 DEBUG nova.network.os_vif_util [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.634 185727 DEBUG nova.network.os_vif_util [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.634 185727 DEBUG os_vif [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.635 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.635 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.635 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.641 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.641 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb14d49f7-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.642 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb14d49f7-53, col_values=(('external_ids', {'iface-id': 'b14d49f7-53e0-4c41-a463-8f16b26817ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:6b:ae', 'vm-uuid': '139d8f81-7f89-4100-af32-e59289aeb6f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.644 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:49 compute-0 NetworkManager[56177]: <info>  [1771248169.6451] manager: (tapb14d49f7-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.646 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.652 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.653 185727 INFO os_vif [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53')
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.727 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.727 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.728 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:49:6b:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:22:49 compute-0 nova_compute[185723]: 2026-02-16 13:22:49.728 185727 INFO nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Using config drive
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.404 185727 INFO nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Creating config drive at /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.config
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.410 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3646tv7v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.536 185727 DEBUG oslo_concurrency.processutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp3646tv7v" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:50 compute-0 kernel: tapb14d49f7-53: entered promiscuous mode
Feb 16 13:22:50 compute-0 NetworkManager[56177]: <info>  [1771248170.5768] manager: (tapb14d49f7-53): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.615 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:50 compute-0 ovn_controller[96072]: 2026-02-16T13:22:50Z|00041|binding|INFO|Claiming lport b14d49f7-53e0-4c41-a463-8f16b26817ae for this chassis.
Feb 16 13:22:50 compute-0 ovn_controller[96072]: 2026-02-16T13:22:50Z|00042|binding|INFO|b14d49f7-53e0-4c41-a463-8f16b26817ae: Claiming fa:16:3e:49:6b:ae 10.100.0.8
Feb 16 13:22:50 compute-0 ovn_controller[96072]: 2026-02-16T13:22:50Z|00043|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae ovn-installed in OVS
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.625 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:50 compute-0 systemd-machined[155229]: New machine qemu-3-instance-00000005.
Feb 16 13:22:50 compute-0 ovn_controller[96072]: 2026-02-16T13:22:50Z|00044|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae up in Southbound
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.661 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:6b:ae 10.100.0.8'], port_security=['fa:16:3e:49:6b:ae 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '139d8f81-7f89-4100-af32-e59289aeb6f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b14d49f7-53e0-4c41-a463-8f16b26817ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.662 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b14d49f7-53e0-4c41-a463-8f16b26817ae in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.663 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:22:50 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Feb 16 13:22:50 compute-0 systemd-udevd[207264]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.675 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[62f6a8b1-6156-4166-8af9-62806c9c2bdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 NetworkManager[56177]: <info>  [1771248170.6789] device (tapb14d49f7-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:22:50 compute-0 NetworkManager[56177]: <info>  [1771248170.6799] device (tapb14d49f7-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.696 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[da4bd6b1-d544-4aba-84a7-43964ea99223]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.698 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[53526364-9229-4b48-bf7f-7f516b952b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.714 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[2807ec46-9e31-4183-8b79-772277f314cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.725 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[11bbb3cc-c442-4998-8989-2aff306fdda2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 19845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207278, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.736 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[315a4615-a57c-463b-8dcf-895357b577ce]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418769, 'tstamp': 418769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207279, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418771, 'tstamp': 418771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207279, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.738 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.739 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.740 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.742 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.743 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.743 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:22:50.743 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.926 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248170.9265287, 139d8f81-7f89-4100-af32-e59289aeb6f5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.928 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] VM Started (Lifecycle Event)
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.954 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.958 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248170.927788, 139d8f81-7f89-4100-af32-e59289aeb6f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.958 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] VM Paused (Lifecycle Event)
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.986 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:50 compute-0 nova_compute[185723]: 2026-02-16 13:22:50.990 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.013 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.322 185727 DEBUG nova.compute.manager [req-399b9793-15f4-465d-abbc-a99dff825076 req-849cc3d2-1720-4a4c-b4fc-71ac440fc38d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.322 185727 DEBUG oslo_concurrency.lockutils [req-399b9793-15f4-465d-abbc-a99dff825076 req-849cc3d2-1720-4a4c-b4fc-71ac440fc38d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.323 185727 DEBUG oslo_concurrency.lockutils [req-399b9793-15f4-465d-abbc-a99dff825076 req-849cc3d2-1720-4a4c-b4fc-71ac440fc38d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.323 185727 DEBUG oslo_concurrency.lockutils [req-399b9793-15f4-465d-abbc-a99dff825076 req-849cc3d2-1720-4a4c-b4fc-71ac440fc38d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.323 185727 DEBUG nova.compute.manager [req-399b9793-15f4-465d-abbc-a99dff825076 req-849cc3d2-1720-4a4c-b4fc-71ac440fc38d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Processing event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.324 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.327 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248171.3277035, 139d8f81-7f89-4100-af32-e59289aeb6f5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.328 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] VM Resumed (Lifecycle Event)
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.329 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.332 185727 INFO nova.virt.libvirt.driver [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Instance spawned successfully.
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.333 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.635 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.639 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.650 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.650 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.651 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.651 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.651 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.652 185727 DEBUG nova.virt.libvirt.driver [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.759 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.793 185727 DEBUG nova.network.neutron [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updated VIF entry in instance network info cache for port b14d49f7-53e0-4c41-a463-8f16b26817ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.794 185727 DEBUG nova.network.neutron [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updating instance_info_cache with network_info: [{"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.796 185727 INFO nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Took 9.18 seconds to spawn the instance on the hypervisor.
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.796 185727 DEBUG nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.827 185727 DEBUG oslo_concurrency.lockutils [req-55db466d-bb2f-44c6-89d3-6348917e4b39 req-cf96fad0-fb5c-466b-8026-f68c83babff6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.900 185727 INFO nova.compute.manager [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Took 9.99 seconds to build instance.
Feb 16 13:22:51 compute-0 nova_compute[185723]: 2026-02-16 13:22:51.933 185727 DEBUG oslo_concurrency.lockutils [None req-ced86f35-95de-416d-8655-58237d8ddebf 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:52 compute-0 nova_compute[185723]: 2026-02-16 13:22:52.136 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:53 compute-0 sshd-session[207287]: Connection closed by authenticating user root 146.190.226.24 port 47478 [preauth]
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.522 185727 DEBUG nova.compute.manager [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.523 185727 DEBUG oslo_concurrency.lockutils [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.524 185727 DEBUG oslo_concurrency.lockutils [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.525 185727 DEBUG oslo_concurrency.lockutils [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.525 185727 DEBUG nova.compute.manager [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:22:53 compute-0 nova_compute[185723]: 2026-02-16 13:22:53.526 185727 WARNING nova.compute.manager [req-b2ea80dd-0269-4d01-b3e7-ab1712ece7ea req-e3f932bf-4488-4fe1-a68c-bc0edb7fc3f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state active and task_state None.
Feb 16 13:22:54 compute-0 podman[207290]: 2026-02-16 13:22:54.018989356 +0000 UTC m=+0.047411932 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:22:54 compute-0 podman[207289]: 2026-02-16 13:22:54.038640045 +0000 UTC m=+0.064037156 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., architecture=x86_64, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Feb 16 13:22:54 compute-0 nova_compute[185723]: 2026-02-16 13:22:54.645 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.564 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.565 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.565 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.949 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.975 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Triggering sync for uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.976 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Triggering sync for uuid 139d8f81-7f89-4100-af32-e59289aeb6f5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.977 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.978 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.979 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:55 compute-0 nova_compute[185723]: 2026-02-16 13:22:55.979 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:56 compute-0 nova_compute[185723]: 2026-02-16 13:22:56.059 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:56 compute-0 nova_compute[185723]: 2026-02-16 13:22:56.061 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:57 compute-0 nova_compute[185723]: 2026-02-16 13:22:57.139 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:58 compute-0 podman[207327]: 2026-02-16 13:22:58.038144182 +0000 UTC m=+0.080577308 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:22:58 compute-0 nova_compute[185723]: 2026-02-16 13:22:58.459 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:59 compute-0 nova_compute[185723]: 2026-02-16 13:22:59.648 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:59 compute-0 podman[195053]: time="2026-02-16T13:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:22:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:22:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Feb 16 13:23:00 compute-0 nova_compute[185723]: 2026-02-16 13:23:00.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:00 compute-0 nova_compute[185723]: 2026-02-16 13:23:00.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:00 compute-0 nova_compute[185723]: 2026-02-16 13:23:00.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:01 compute-0 openstack_network_exporter[197909]: ERROR   13:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:23:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:23:01 compute-0 openstack_network_exporter[197909]: ERROR   13:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:23:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.472 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.472 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.473 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.473 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.574 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.635 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.636 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.682 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.688 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.733 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.734 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.778 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.904 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.905 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5485MB free_disk=73.19770812988281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.906 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:01 compute-0 nova_compute[185723]: 2026-02-16 13:23:01.906 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.140 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.181 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance b21f8b55-68d7-4cd7-beed-2d61f932f84e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.182 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 139d8f81-7f89-4100-af32-e59289aeb6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.182 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.182 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.601 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.647 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.745 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.746 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.852 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.853 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:02 compute-0 nova_compute[185723]: 2026-02-16 13:23:02.987 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.177 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.177 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.190 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.190 185727 INFO nova.compute.claims [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:23:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:03.214 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:03.214 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:03.215 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.539 185727 DEBUG nova.compute.provider_tree [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.561 185727 DEBUG nova.scheduler.client.report [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.616 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.617 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.715 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.717 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.746 185727 INFO nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.749 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.749 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.749 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:23:03 compute-0 ovn_controller[96072]: 2026-02-16T13:23:03Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:6b:ae 10.100.0.8
Feb 16 13:23:03 compute-0 ovn_controller[96072]: 2026-02-16T13:23:03Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:6b:ae 10.100.0.8
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.783 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 13:23:03 compute-0 nova_compute[185723]: 2026-02-16 13:23:03.787 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.038 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.040 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.040 185727 INFO nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Creating image(s)
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.041 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.041 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.042 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.058 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.108 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.109 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.109 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.123 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.167 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.168 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.197 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.197 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.198 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.242 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.243 185727 DEBUG nova.virt.disk.api [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.243 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.302 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.303 185727 DEBUG nova.virt.disk.api [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.304 185727 DEBUG nova.objects.instance [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid 393399a7-477d-4663-9a81-2b968a02bb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.571 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.571 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Ensure instance console log exists: /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.572 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.572 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.572 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.650 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.775 185727 DEBUG nova.policy [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.882 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.882 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.882 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:23:04 compute-0 nova_compute[185723]: 2026-02-16 13:23:04.883 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:07 compute-0 nova_compute[185723]: 2026-02-16 13:23:07.143 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:08 compute-0 podman[207396]: 2026-02-16 13:23:08.028438607 +0000 UTC m=+0.071380269 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:23:08 compute-0 nova_compute[185723]: 2026-02-16 13:23:08.141 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Successfully created port: 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:23:09 compute-0 nova_compute[185723]: 2026-02-16 13:23:09.653 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.049 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.079 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.080 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.081 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.081 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.082 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.082 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.083 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:10 compute-0 sshd-session[207422]: Invalid user postgres from 188.166.42.159 port 46210
Feb 16 13:23:10 compute-0 sshd-session[207422]: Connection closed by invalid user postgres 188.166.42.159 port 46210 [preauth]
Feb 16 13:23:10 compute-0 nova_compute[185723]: 2026-02-16 13:23:10.791 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.294 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Successfully updated port: 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.335 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.335 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.336 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.603 185727 DEBUG nova.compute.manager [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-changed-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.603 185727 DEBUG nova.compute.manager [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Refreshing instance network info cache due to event network-changed-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:23:11 compute-0 nova_compute[185723]: 2026-02-16 13:23:11.603 185727 DEBUG oslo_concurrency.lockutils [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:12 compute-0 nova_compute[185723]: 2026-02-16 13:23:12.143 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:12 compute-0 nova_compute[185723]: 2026-02-16 13:23:12.150 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:23:14 compute-0 nova_compute[185723]: 2026-02-16 13:23:14.654 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:15 compute-0 nova_compute[185723]: 2026-02-16 13:23:15.608 185727 DEBUG nova.network.neutron [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Updating instance_info_cache with network_info: [{"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.210 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.211 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Instance network_info: |[{"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.211 185727 DEBUG oslo_concurrency.lockutils [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.211 185727 DEBUG nova.network.neutron [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Refreshing network info cache for port 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.215 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Start _get_guest_xml network_info=[{"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.220 185727 WARNING nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.241 185727 DEBUG nova.virt.libvirt.host [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.242 185727 DEBUG nova.virt.libvirt.host [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.246 185727 DEBUG nova.virt.libvirt.host [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.247 185727 DEBUG nova.virt.libvirt.host [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.248 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.249 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.249 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.250 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.250 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.251 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.251 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.251 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.252 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.252 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.253 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.253 185727 DEBUG nova.virt.hardware [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.258 185727 DEBUG nova.virt.libvirt.vif [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2071860474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2071860474',id=6,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-paq9ey49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-pro
ject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:23:03Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=393399a7-477d-4663-9a81-2b968a02bb03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.258 185727 DEBUG nova.network.os_vif_util [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.259 185727 DEBUG nova.network.os_vif_util [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.261 185727 DEBUG nova.objects.instance [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 393399a7-477d-4663-9a81-2b968a02bb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.298 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <uuid>393399a7-477d-4663-9a81-2b968a02bb03</uuid>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <name>instance-00000006</name>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-2071860474</nova:name>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:23:16</nova:creationTime>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         <nova:port uuid="6fabb627-8b4f-4fd3-b05a-ceb17816ec5d">
Feb 16 13:23:16 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <system>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="serial">393399a7-477d-4663-9a81-2b968a02bb03</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="uuid">393399a7-477d-4663-9a81-2b968a02bb03</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </system>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <os>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </os>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <features>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </features>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.config"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:ac:31:46"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <target dev="tap6fabb627-8b"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/console.log" append="off"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <video>
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </video>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:23:16 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:23:16 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:23:16 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:23:16 compute-0 nova_compute[185723]: </domain>
Feb 16 13:23:16 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.300 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Preparing to wait for external event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.301 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.301 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.301 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.302 185727 DEBUG nova.virt.libvirt.vif [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2071860474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2071860474',id=6,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-paq9ey49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:23:03Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=393399a7-477d-4663-9a81-2b968a02bb03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.302 185727 DEBUG nova.network.os_vif_util [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.303 185727 DEBUG nova.network.os_vif_util [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.304 185727 DEBUG os_vif [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.305 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.305 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.306 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.309 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.309 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6fabb627-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.310 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6fabb627-8b, col_values=(('external_ids', {'iface-id': '6fabb627-8b4f-4fd3-b05a-ceb17816ec5d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:31:46', 'vm-uuid': '393399a7-477d-4663-9a81-2b968a02bb03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.312 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:16 compute-0 NetworkManager[56177]: <info>  [1771248196.3139] manager: (tap6fabb627-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.314 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.320 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.321 185727 INFO os_vif [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b')
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.442 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.443 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.443 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:ac:31:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:23:16 compute-0 nova_compute[185723]: 2026-02-16 13:23:16.444 185727 INFO nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Using config drive
Feb 16 13:23:17 compute-0 nova_compute[185723]: 2026-02-16 13:23:17.146 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.145 185727 INFO nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Creating config drive at /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.config
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.149 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8p5vp2en execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.270 185727 DEBUG oslo_concurrency.processutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp8p5vp2en" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:18 compute-0 kernel: tap6fabb627-8b: entered promiscuous mode
Feb 16 13:23:18 compute-0 NetworkManager[56177]: <info>  [1771248198.3055] manager: (tap6fabb627-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Feb 16 13:23:18 compute-0 ovn_controller[96072]: 2026-02-16T13:23:18Z|00045|binding|INFO|Claiming lport 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d for this chassis.
Feb 16 13:23:18 compute-0 ovn_controller[96072]: 2026-02-16T13:23:18Z|00046|binding|INFO|6fabb627-8b4f-4fd3-b05a-ceb17816ec5d: Claiming fa:16:3e:ac:31:46 10.100.0.7
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.307 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 ovn_controller[96072]: 2026-02-16T13:23:18Z|00047|binding|INFO|Setting lport 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d ovn-installed in OVS
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.313 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 ovn_controller[96072]: 2026-02-16T13:23:18Z|00048|binding|INFO|Setting lport 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d up in Southbound
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.314 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.315 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:31:46 10.100.0.7'], port_security=['fa:16:3e:ac:31:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '393399a7-477d-4663-9a81-2b968a02bb03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.316 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.318 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.326 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f7690ed7-c868-4c5e-b8a7-cb1f7db2bb25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 systemd-udevd[207442]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:23:18 compute-0 NetworkManager[56177]: <info>  [1771248198.3375] device (tap6fabb627-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:23:18 compute-0 NetworkManager[56177]: <info>  [1771248198.3381] device (tap6fabb627-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:23:18 compute-0 systemd-machined[155229]: New machine qemu-4-instance-00000006.
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.352 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b8c3c4-176e-4a64-afb6-186269d117ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.355 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c99b81-8092-4d2d-8b4e-b89efd3866e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.377 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[099958e5-3a3e-400d-b720-060c393eaa49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.393 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b35c8482-4e29-4331-bdde-7e0dfcc66213]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 19845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207454, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.410 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[141bc859-9f61-4fee-920f-434f34292f5d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418769, 'tstamp': 418769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207458, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418771, 'tstamp': 418771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207458, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.412 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.413 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.414 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.417 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.417 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.417 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:18 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:18.417 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.804 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248198.8037663, 393399a7-477d-4663-9a81-2b968a02bb03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.804 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] VM Started (Lifecycle Event)
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.851 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.854 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248198.8040774, 393399a7-477d-4663-9a81-2b968a02bb03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.855 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] VM Paused (Lifecycle Event)
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.908 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.911 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:23:18 compute-0 nova_compute[185723]: 2026-02-16 13:23:18.974 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.090 185727 DEBUG nova.compute.manager [req-aa45a7a5-4056-48c4-a9b8-5d9db71d7bae req-1fe05bdf-1735-40c9-b09e-b90b37f66da8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.090 185727 DEBUG oslo_concurrency.lockutils [req-aa45a7a5-4056-48c4-a9b8-5d9db71d7bae req-1fe05bdf-1735-40c9-b09e-b90b37f66da8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.090 185727 DEBUG oslo_concurrency.lockutils [req-aa45a7a5-4056-48c4-a9b8-5d9db71d7bae req-1fe05bdf-1735-40c9-b09e-b90b37f66da8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.090 185727 DEBUG oslo_concurrency.lockutils [req-aa45a7a5-4056-48c4-a9b8-5d9db71d7bae req-1fe05bdf-1735-40c9-b09e-b90b37f66da8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.090 185727 DEBUG nova.compute.manager [req-aa45a7a5-4056-48c4-a9b8-5d9db71d7bae req-1fe05bdf-1735-40c9-b09e-b90b37f66da8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Processing event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.092 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.097 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248199.096679, 393399a7-477d-4663-9a81-2b968a02bb03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.097 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] VM Resumed (Lifecycle Event)
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.101 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.104 185727 INFO nova.virt.libvirt.driver [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Instance spawned successfully.
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.105 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.130 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.133 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.134 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.135 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.135 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.136 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.136 185727 DEBUG nova.virt.libvirt.driver [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.141 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.182 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.213 185727 INFO nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Took 15.17 seconds to spawn the instance on the hypervisor.
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.213 185727 DEBUG nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.392 185727 INFO nova.compute.manager [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Took 16.28 seconds to build instance.
Feb 16 13:23:19 compute-0 nova_compute[185723]: 2026-02-16 13:23:19.414 185727 DEBUG oslo_concurrency.lockutils [None req-77a19d73-6ece-484c-85f8-0fb32e8c5af9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:20 compute-0 nova_compute[185723]: 2026-02-16 13:23:20.224 185727 DEBUG nova.network.neutron [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Updated VIF entry in instance network info cache for port 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:23:20 compute-0 nova_compute[185723]: 2026-02-16 13:23:20.224 185727 DEBUG nova.network.neutron [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Updating instance_info_cache with network_info: [{"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:20 compute-0 nova_compute[185723]: 2026-02-16 13:23:20.257 185727 DEBUG oslo_concurrency.lockutils [req-a92c8e42-8c0e-4d78-80b7-4c6cf3149376 req-95541980-3147-4914-9c9b-e64a4af6e7ed faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-393399a7-477d-4663-9a81-2b968a02bb03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.313 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.615 185727 DEBUG nova.compute.manager [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.616 185727 DEBUG oslo_concurrency.lockutils [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.616 185727 DEBUG oslo_concurrency.lockutils [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.616 185727 DEBUG oslo_concurrency.lockutils [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.616 185727 DEBUG nova.compute.manager [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] No waiting events found dispatching network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:21 compute-0 nova_compute[185723]: 2026-02-16 13:23:21.617 185727 WARNING nova.compute.manager [req-95edf9c1-4084-4bec-ad7b-e446110b417d req-75501f38-317f-4c40-b65b-67d3efc803a4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received unexpected event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d for instance with vm_state active and task_state None.
Feb 16 13:23:22 compute-0 nova_compute[185723]: 2026-02-16 13:23:22.149 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:25 compute-0 podman[207468]: 2026-02-16 13:23:25.017787481 +0000 UTC m=+0.057312083 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:23:25 compute-0 podman[207467]: 2026-02-16 13:23:25.049224816 +0000 UTC m=+0.090434050 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal)
Feb 16 13:23:25 compute-0 sshd-session[207506]: Connection closed by 45.91.64.7 port 60023
Feb 16 13:23:26 compute-0 nova_compute[185723]: 2026-02-16 13:23:26.319 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:27 compute-0 nova_compute[185723]: 2026-02-16 13:23:27.153 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:29 compute-0 podman[207507]: 2026-02-16 13:23:29.105161518 +0000 UTC m=+0.142129641 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:23:29 compute-0 podman[195053]: time="2026-02-16T13:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:23:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:23:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 13:23:31 compute-0 nova_compute[185723]: 2026-02-16 13:23:31.324 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:31 compute-0 openstack_network_exporter[197909]: ERROR   13:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:23:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:23:31 compute-0 openstack_network_exporter[197909]: ERROR   13:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:23:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:23:31 compute-0 sshd-session[207549]: Connection closed by authenticating user root 64.227.72.94 port 59026 [preauth]
Feb 16 13:23:31 compute-0 ovn_controller[96072]: 2026-02-16T13:23:31Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:31:46 10.100.0.7
Feb 16 13:23:31 compute-0 ovn_controller[96072]: 2026-02-16T13:23:31Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:31:46 10.100.0.7
Feb 16 13:23:32 compute-0 nova_compute[185723]: 2026-02-16 13:23:32.153 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:36 compute-0 nova_compute[185723]: 2026-02-16 13:23:36.326 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:37 compute-0 nova_compute[185723]: 2026-02-16 13:23:37.154 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:37 compute-0 nova_compute[185723]: 2026-02-16 13:23:37.314 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Check if temp file /var/lib/nova/instances/tmp2w75m052 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:23:37 compute-0 nova_compute[185723]: 2026-02-16 13:23:37.315 185727 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:23:38 compute-0 nova_compute[185723]: 2026-02-16 13:23:38.891 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:38 compute-0 nova_compute[185723]: 2026-02-16 13:23:38.942 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:38 compute-0 nova_compute[185723]: 2026-02-16 13:23:38.943 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:38 compute-0 nova_compute[185723]: 2026-02-16 13:23:38.994 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:39 compute-0 podman[207552]: 2026-02-16 13:23:39.01151029 +0000 UTC m=+0.051891277 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:23:41 compute-0 nova_compute[185723]: 2026-02-16 13:23:41.329 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:42 compute-0 nova_compute[185723]: 2026-02-16 13:23:42.159 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:44 compute-0 sshd-session[207581]: Accepted publickey for nova from 192.168.122.101 port 54672 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:23:44 compute-0 systemd-logind[818]: New session 27 of user nova.
Feb 16 13:23:44 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:23:44 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:23:44 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:23:44 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:23:44 compute-0 systemd[207585]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:23:44 compute-0 systemd[207585]: Queued start job for default target Main User Target.
Feb 16 13:23:44 compute-0 systemd[207585]: Created slice User Application Slice.
Feb 16 13:23:44 compute-0 systemd[207585]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:23:44 compute-0 systemd[207585]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:23:44 compute-0 systemd[207585]: Reached target Paths.
Feb 16 13:23:44 compute-0 systemd[207585]: Reached target Timers.
Feb 16 13:23:44 compute-0 systemd[207585]: Starting D-Bus User Message Bus Socket...
Feb 16 13:23:44 compute-0 systemd[207585]: Starting Create User's Volatile Files and Directories...
Feb 16 13:23:44 compute-0 systemd[207585]: Finished Create User's Volatile Files and Directories.
Feb 16 13:23:44 compute-0 systemd[207585]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:23:44 compute-0 systemd[207585]: Reached target Sockets.
Feb 16 13:23:44 compute-0 systemd[207585]: Reached target Basic System.
Feb 16 13:23:44 compute-0 systemd[207585]: Reached target Main User Target.
Feb 16 13:23:44 compute-0 systemd[207585]: Startup finished in 134ms.
Feb 16 13:23:44 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:23:44 compute-0 systemd[1]: Started Session 27 of User nova.
Feb 16 13:23:44 compute-0 sshd-session[207581]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:23:45 compute-0 sshd-session[207600]: Received disconnect from 192.168.122.101 port 54672:11: disconnected by user
Feb 16 13:23:45 compute-0 sshd-session[207600]: Disconnected from user nova 192.168.122.101 port 54672
Feb 16 13:23:45 compute-0 sshd-session[207581]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:23:45 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 16 13:23:45 compute-0 systemd-logind[818]: Session 27 logged out. Waiting for processes to exit.
Feb 16 13:23:45 compute-0 systemd-logind[818]: Removed session 27.
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.227 185727 DEBUG nova.compute.manager [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.229 185727 DEBUG oslo_concurrency.lockutils [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.229 185727 DEBUG oslo_concurrency.lockutils [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.230 185727 DEBUG oslo_concurrency.lockutils [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.230 185727 DEBUG nova.compute.manager [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.230 185727 DEBUG nova.compute.manager [req-3ac0d519-ff4c-406c-9d4b-d601c2ad8c9f req-118f6294-ae8b-4fa5-a72b-4cb82583f5b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:23:46 compute-0 nova_compute[185723]: 2026-02-16 13:23:46.331 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:47 compute-0 nova_compute[185723]: 2026-02-16 13:23:47.163 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:48 compute-0 ovn_controller[96072]: 2026-02-16T13:23:48Z|00049|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.782 185727 DEBUG nova.compute.manager [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.782 185727 DEBUG oslo_concurrency.lockutils [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.782 185727 DEBUG oslo_concurrency.lockutils [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.783 185727 DEBUG oslo_concurrency.lockutils [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.783 185727 DEBUG nova.compute.manager [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.783 185727 WARNING nova.compute.manager [req-27519786-e0e7-4f92-838a-563797462eb0 req-27c8ed0a-2a69-435d-a8cb-ad4b51a7617f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state migrating.
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.918 185727 INFO nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Took 9.92 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.918 185727 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.949 185727 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(735afe49-654b-4958-aac0-48e243770fe0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.974 185727 DEBUG nova.objects.instance [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.976 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.979 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:23:48 compute-0 nova_compute[185723]: 2026-02-16 13:23:48.979 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.000 185727 DEBUG nova.virt.libvirt.vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:21:47Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.000 185727 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.001 185727 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.002 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:23:49 compute-0 nova_compute[185723]:   <mac address="fa:16:3e:8a:da:08"/>
Feb 16 13:23:49 compute-0 nova_compute[185723]:   <model type="virtio"/>
Feb 16 13:23:49 compute-0 nova_compute[185723]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:23:49 compute-0 nova_compute[185723]:   <mtu size="1442"/>
Feb 16 13:23:49 compute-0 nova_compute[185723]:   <target dev="tap3bdc1813-a8"/>
Feb 16 13:23:49 compute-0 nova_compute[185723]: </interface>
Feb 16 13:23:49 compute-0 nova_compute[185723]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.002 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.483 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.485 185727 INFO nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:23:49 compute-0 nova_compute[185723]: 2026-02-16 13:23:49.613 185727 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:23:50 compute-0 nova_compute[185723]: 2026-02-16 13:23:50.117 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:23:50 compute-0 nova_compute[185723]: 2026-02-16 13:23:50.118 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:23:50 compute-0 nova_compute[185723]: 2026-02-16 13:23:50.621 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:23:50 compute-0 nova_compute[185723]: 2026-02-16 13:23:50.622 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.097 185727 DEBUG nova.compute.manager [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-changed-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.097 185727 DEBUG nova.compute.manager [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Refreshing instance network info cache due to event network-changed-3bdc1813-a8d3-43b8-805c-95acd138d9d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.098 185727 DEBUG oslo_concurrency.lockutils [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.098 185727 DEBUG oslo_concurrency.lockutils [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.099 185727 DEBUG nova.network.neutron [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Refreshing network info cache for port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.126 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.126 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.338 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.583 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248231.5828216, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.583 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Paused (Lifecycle Event)
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.629 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.634 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.716 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.875 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:23:51 compute-0 nova_compute[185723]: 2026-02-16 13:23:51.875 185727 DEBUG nova.virt.libvirt.migration [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.165 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 kernel: tap3bdc1813-a8 (unregistering): left promiscuous mode
Feb 16 13:23:52 compute-0 NetworkManager[56177]: <info>  [1771248232.2940] device (tap3bdc1813-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.304 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 ovn_controller[96072]: 2026-02-16T13:23:52Z|00050|binding|INFO|Releasing lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 from this chassis (sb_readonly=0)
Feb 16 13:23:52 compute-0 ovn_controller[96072]: 2026-02-16T13:23:52Z|00051|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 down in Southbound
Feb 16 13:23:52 compute-0 ovn_controller[96072]: 2026-02-16T13:23:52Z|00052|binding|INFO|Removing iface tap3bdc1813-a8 ovn-installed in OVS
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.307 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.314 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.328 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:da:08 10.100.0.4'], port_security=['fa:16:3e:8a:da:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '54c1a259-778a-4222-b2c6-8422ea19a065'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '8', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=3bdc1813-a8d3-43b8-805c-95acd138d9d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.331 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.333 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.349 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b18d73-80b9-4de3-a09d-d61e761be3cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 16 13:23:52 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 15.984s CPU time.
Feb 16 13:23:52 compute-0 systemd-machined[155229]: Machine qemu-2-instance-00000003 terminated.
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.381 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfecdd9-48d7-432b-acbd-515f60b38a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.386 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8b8add-0e45-484f-bf7b-4a007fdd9fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.416 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[c7650ec0-703d-4a49-8875-d92c40c78209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.435 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce0371c-87a6-4aeb-8a85-ae6c5c47bdc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 19845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207635, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.450 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1794ed99-9215-491b-a23e-72ee1cdba15e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418769, 'tstamp': 418769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207636, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418771, 'tstamp': 418771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207636, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.452 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.454 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.458 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.459 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.460 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.460 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:52.461 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.531 185727 DEBUG nova.virt.libvirt.guest [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.532 185727 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migration operation has completed
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.533 185727 INFO nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] _post_live_migration() is started..
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.540 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.540 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:23:52 compute-0 nova_compute[185723]: 2026-02-16 13:23:52.540 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.526 185727 DEBUG nova.compute.manager [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.526 185727 DEBUG oslo_concurrency.lockutils [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.527 185727 DEBUG oslo_concurrency.lockutils [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.527 185727 DEBUG oslo_concurrency.lockutils [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.527 185727 DEBUG nova.compute.manager [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:53 compute-0 nova_compute[185723]: 2026-02-16 13:23:53.527 185727 DEBUG nova.compute.manager [req-351a6a73-1f46-4974-9774-a5b2bc1204fc req-c445bc3a-c3c8-4fd5-b186-4b6d28ef74b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:23:55 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:23:55 compute-0 systemd[207585]: Activating special unit Exit the Session...
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped target Main User Target.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped target Basic System.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped target Paths.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped target Sockets.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped target Timers.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:23:55 compute-0 systemd[207585]: Closed D-Bus User Message Bus Socket.
Feb 16 13:23:55 compute-0 systemd[207585]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:23:55 compute-0 systemd[207585]: Removed slice User Application Slice.
Feb 16 13:23:55 compute-0 systemd[207585]: Reached target Shutdown.
Feb 16 13:23:55 compute-0 systemd[207585]: Finished Exit the Session.
Feb 16 13:23:55 compute-0 systemd[207585]: Reached target Exit the Session.
Feb 16 13:23:55 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:23:55 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:23:55 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:23:55 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:23:55 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:23:55 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:23:55 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:23:55 compute-0 podman[207656]: 2026-02-16 13:23:55.280675663 +0000 UTC m=+0.060521232 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:23:55 compute-0 podman[207655]: 2026-02-16 13:23:55.303322799 +0000 UTC m=+0.079515807 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, version=9.7, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.689 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:55.690 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:23:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:55.691 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.796 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.796 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.797 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.797 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.797 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.797 185727 WARNING nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state migrating.
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.797 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.798 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.798 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.798 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.798 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.798 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 DEBUG oslo_concurrency.lockutils [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 DEBUG nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:55 compute-0 nova_compute[185723]: 2026-02-16 13:23:55.799 185727 WARNING nova.compute.manager [req-27d2312d-776a-43fb-8d2e-c0ac06957205 req-6b68978a-4b5f-40c5-83c6-758920c3fb2b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state migrating.
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.283 185727 DEBUG nova.network.neutron [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updated VIF entry in instance network info cache for port 3bdc1813-a8d3-43b8-805c-95acd138d9d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.283 185727 DEBUG nova.network.neutron [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.336 185727 DEBUG oslo_concurrency.lockutils [req-5220f890-0bc2-4d2f-92b4-26ab63c0b9a9 req-a9e78cff-887d-4c18-b94d-eed244d9dd50 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.345 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.455 185727 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.456 185727 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.457 185727 DEBUG nova.virt.libvirt.vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:23:34Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.458 185727 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.459 185727 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.459 185727 DEBUG os_vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.461 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.462 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdc1813-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.464 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.468 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.471 185727 INFO os_vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8')
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.472 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.473 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.473 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.473 185727 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.474 185727 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Deleting instance files /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e_del
Feb 16 13:23:56 compute-0 nova_compute[185723]: 2026-02-16 13:23:56.475 185727 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Deletion of /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e_del complete
Feb 16 13:23:57 compute-0 nova_compute[185723]: 2026-02-16 13:23:57.168 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:57 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:23:57.694 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.017 185727 DEBUG nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.017 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.017 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.018 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.018 185727 DEBUG nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.018 185727 WARNING nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state migrating.
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.018 185727 DEBUG nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.018 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.019 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.019 185727 DEBUG oslo_concurrency.lockutils [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.019 185727 DEBUG nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:23:58 compute-0 nova_compute[185723]: 2026-02-16 13:23:58.019 185727 WARNING nova.compute.manager [req-5e65746c-6be9-4cb8-b06d-51a12dc7e20b req-401305d3-c9df-4c2c-aa95-5c65621b79b5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state active and task_state migrating.
Feb 16 13:23:59 compute-0 podman[195053]: time="2026-02-16T13:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:23:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:23:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Feb 16 13:24:00 compute-0 podman[207696]: 2026-02-16 13:24:00.048058473 +0000 UTC m=+0.075227770 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 16 13:24:00 compute-0 nova_compute[185723]: 2026-02-16 13:24:00.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:01 compute-0 openstack_network_exporter[197909]: ERROR   13:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:24:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:24:01 compute-0 openstack_network_exporter[197909]: ERROR   13:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:24:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:24:01 compute-0 nova_compute[185723]: 2026-02-16 13:24:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:01 compute-0 nova_compute[185723]: 2026-02-16 13:24:01.506 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:02 compute-0 nova_compute[185723]: 2026-02-16 13:24:02.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:02 compute-0 sshd-session[207722]: Connection closed by authenticating user root 146.190.226.24 port 58066 [preauth]
Feb 16 13:24:02 compute-0 nova_compute[185723]: 2026-02-16 13:24:02.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:03.215 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:03.219 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:03.220 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:03 compute-0 nova_compute[185723]: 2026-02-16 13:24:03.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:03 compute-0 nova_compute[185723]: 2026-02-16 13:24:03.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:24:04 compute-0 nova_compute[185723]: 2026-02-16 13:24:04.225 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:24:04 compute-0 nova_compute[185723]: 2026-02-16 13:24:04.225 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:24:04 compute-0 nova_compute[185723]: 2026-02-16 13:24:04.225 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:24:05 compute-0 sshd-session[207738]: Invalid user oracle from 188.166.42.159 port 38268
Feb 16 13:24:05 compute-0 sshd-session[207738]: Connection closed by invalid user oracle 188.166.42.159 port 38268 [preauth]
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.373 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.373 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.373 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.434 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.435 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.436 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.436 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.507 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.555 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.613 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.614 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.691 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.697 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.760 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.761 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.817 185727 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.946 185727 WARNING nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.947 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5491MB free_disk=73.16933059692383GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.947 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:06 compute-0 nova_compute[185723]: 2026-02-16 13:24:06.947 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.094 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance b21f8b55-68d7-4cd7-beed-2d61f932f84e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.162 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.216 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Instance 139d8f81-7f89-4100-af32-e59289aeb6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.216 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Instance 393399a7-477d-4663-9a81-2b968a02bb03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.217 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 735afe49-654b-4958-aac0-48e243770fe0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.217 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.218 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.410 185727 DEBUG nova.compute.provider_tree [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.430 185727 DEBUG nova.scheduler.client.report [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.462 185727 DEBUG nova.compute.resource_tracker [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.463 185727 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.471 185727 INFO nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.540 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248232.5307903, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.541 185727 INFO nova.compute.manager [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Stopped (Lifecycle Event)
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.579 185727 DEBUG nova.compute.manager [None req-d98a21c7-6397-4a5a-9d99-e975d1c52f88 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.634 185727 INFO nova.scheduler.client.report [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 735afe49-654b-4958-aac0-48e243770fe0
Feb 16 13:24:07 compute-0 nova_compute[185723]: 2026-02-16 13:24:07.634 185727 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.104 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updating instance_info_cache with network_info: [{"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.130 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-139d8f81-7f89-4100-af32-e59289aeb6f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.131 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.131 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.132 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.132 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.132 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.132 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.164 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.164 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.165 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.165 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.323 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.374 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.375 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.427 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.433 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.484 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.485 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.538 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.714 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.717 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5485MB free_disk=73.16933059692383GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.718 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.718 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.880 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 139d8f81-7f89-4100-af32-e59289aeb6f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.881 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 393399a7-477d-4663-9a81-2b968a02bb03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.881 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:24:08 compute-0 nova_compute[185723]: 2026-02-16 13:24:08.881 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:24:09 compute-0 nova_compute[185723]: 2026-02-16 13:24:09.069 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:09 compute-0 nova_compute[185723]: 2026-02-16 13:24:09.090 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:09 compute-0 nova_compute[185723]: 2026-02-16 13:24:09.092 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:24:09 compute-0 nova_compute[185723]: 2026-02-16 13:24:09.092 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:10 compute-0 podman[207765]: 2026-02-16 13:24:10.057174862 +0000 UTC m=+0.088874201 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.547 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.547 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.548 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.548 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.548 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.550 185727 INFO nova.compute.manager [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Terminating instance
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.551 185727 DEBUG nova.compute.manager [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:10 compute-0 kernel: tap6fabb627-8b (unregistering): left promiscuous mode
Feb 16 13:24:10 compute-0 NetworkManager[56177]: <info>  [1771248250.5780] device (tap6fabb627-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:10 compute-0 ovn_controller[96072]: 2026-02-16T13:24:10Z|00053|binding|INFO|Releasing lport 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d from this chassis (sb_readonly=0)
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.587 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 ovn_controller[96072]: 2026-02-16T13:24:10Z|00054|binding|INFO|Setting lport 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d down in Southbound
Feb 16 13:24:10 compute-0 ovn_controller[96072]: 2026-02-16T13:24:10Z|00055|binding|INFO|Removing iface tap6fabb627-8b ovn-installed in OVS
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.590 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.593 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.595 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:31:46 10.100.0.7'], port_security=['fa:16:3e:ac:31:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '393399a7-477d-4663-9a81-2b968a02bb03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.596 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 6fabb627-8b4f-4fd3-b05a-ceb17816ec5d in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.597 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.608 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e5924971-f2e5-4179-82fc-dd187faf0710]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.633 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd15f7b-84f8-4f5e-8f87-f3a123578427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 13.263s CPU time.
Feb 16 13:24:10 compute-0 systemd-machined[155229]: Machine qemu-4-instance-00000006 terminated.
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.637 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[80ba16f8-a278-4e6a-9a30-22d03bb35e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.663 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3919d4-bf3e-4133-88f0-0c47acb57a17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.676 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[71563bc4-2a6d-40f9-acc4-15a39db35010]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418760, 'reachable_time': 42186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207803, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.686 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[42532347-a8d1-4831-9644-3a6e26ad11b8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418769, 'tstamp': 418769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207804, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418771, 'tstamp': 418771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207804, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.688 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.689 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.693 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.694 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.694 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.694 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:10.695 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.778 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.812 185727 INFO nova.virt.libvirt.driver [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Instance destroyed successfully.
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.813 185727 DEBUG nova.objects.instance [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid 393399a7-477d-4663-9a81-2b968a02bb03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.846 185727 DEBUG nova.virt.libvirt.vif [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2071860474',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2071860474',id=6,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:23:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-paq9ey49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:23:19Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=393399a7-477d-4663-9a81-2b968a02bb03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.846 185727 DEBUG nova.network.os_vif_util [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "address": "fa:16:3e:ac:31:46", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fabb627-8b", "ovs_interfaceid": "6fabb627-8b4f-4fd3-b05a-ceb17816ec5d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.847 185727 DEBUG nova.network.os_vif_util [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.847 185727 DEBUG os_vif [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.850 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.850 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6fabb627-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.852 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.853 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.857 185727 INFO os_vif [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:31:46,bridge_name='br-int',has_traffic_filtering=True,id=6fabb627-8b4f-4fd3-b05a-ceb17816ec5d,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fabb627-8b')
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.857 185727 INFO nova.virt.libvirt.driver [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Deleting instance files /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03_del
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.859 185727 INFO nova.virt.libvirt.driver [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Deletion of /var/lib/nova/instances/393399a7-477d-4663-9a81-2b968a02bb03_del complete
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.991 185727 INFO nova.compute.manager [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Took 0.44 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.991 185727 DEBUG oslo.service.loopingcall [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.991 185727 DEBUG nova.compute.manager [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.992 185727 DEBUG nova.network.neutron [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.997 185727 DEBUG nova.compute.manager [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-unplugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.997 185727 DEBUG oslo_concurrency.lockutils [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.998 185727 DEBUG oslo_concurrency.lockutils [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.998 185727 DEBUG oslo_concurrency.lockutils [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.998 185727 DEBUG nova.compute.manager [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] No waiting events found dispatching network-vif-unplugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:10 compute-0 nova_compute[185723]: 2026-02-16 13:24:10.998 185727 DEBUG nova.compute.manager [req-256b026a-d0d8-41c5-9765-c00c92f7248a req-d75abcce-4bba-4bfe-add5-616c0329bd24 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-unplugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:11 compute-0 nova_compute[185723]: 2026-02-16 13:24:11.087 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:12 compute-0 nova_compute[185723]: 2026-02-16 13:24:12.175 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.167 185727 DEBUG nova.compute.manager [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.167 185727 DEBUG oslo_concurrency.lockutils [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "393399a7-477d-4663-9a81-2b968a02bb03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.167 185727 DEBUG oslo_concurrency.lockutils [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.168 185727 DEBUG oslo_concurrency.lockutils [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.168 185727 DEBUG nova.compute.manager [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] No waiting events found dispatching network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.168 185727 WARNING nova.compute.manager [req-da233190-46e6-4fc8-9932-745ff3949b50 req-b49268b1-ee72-4ce7-9d1d-a2f35adca45b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received unexpected event network-vif-plugged-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d for instance with vm_state active and task_state deleting.
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.246 185727 DEBUG nova.network.neutron [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.274 185727 INFO nova.compute.manager [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Took 2.28 seconds to deallocate network for instance.
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.336 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.336 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.400 185727 DEBUG nova.compute.manager [req-2a817c4f-11c1-4221-ac24-0336fe7c16a3 req-e0284829-10d2-40e6-a31e-600cff1af542 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Received event network-vif-deleted-6fabb627-8b4f-4fd3-b05a-ceb17816ec5d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.461 185727 DEBUG nova.compute.provider_tree [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.478 185727 DEBUG nova.scheduler.client.report [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.509 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.546 185727 INFO nova.scheduler.client.report [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance 393399a7-477d-4663-9a81-2b968a02bb03
Feb 16 13:24:13 compute-0 nova_compute[185723]: 2026-02-16 13:24:13.659 185727 DEBUG oslo_concurrency.lockutils [None req-e51d6f6c-4c1f-4ffe-842c-00cf92b60fe8 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "393399a7-477d-4663-9a81-2b968a02bb03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:15 compute-0 nova_compute[185723]: 2026-02-16 13:24:15.852 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.102 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.103 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.103 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.104 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.105 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.107 185727 INFO nova.compute.manager [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Terminating instance
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.108 185727 DEBUG nova.compute.manager [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:16 compute-0 kernel: tapb14d49f7-53 (unregistering): left promiscuous mode
Feb 16 13:24:16 compute-0 NetworkManager[56177]: <info>  [1771248256.1351] device (tapb14d49f7-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00056|binding|INFO|Releasing lport b14d49f7-53e0-4c41-a463-8f16b26817ae from this chassis (sb_readonly=0)
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00057|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae down in Southbound
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00058|binding|INFO|Removing iface tapb14d49f7-53 ovn-installed in OVS
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.140 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.142 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.155 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.151 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:6b:ae 10.100.0.8'], port_security=['fa:16:3e:49:6b:ae 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '139d8f81-7f89-4100-af32-e59289aeb6f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b14d49f7-53e0-4c41-a463-8f16b26817ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.154 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b14d49f7-53e0-4c41-a463-8f16b26817ae in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.157 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6199784-1742-41a7-9152-bb54abb7bef1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.158 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[63161bcd-e8b6-44e3-983a-8e26b0db7fea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.159 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace which is not needed anymore
Feb 16 13:24:16 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 16 13:24:16 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 14.119s CPU time.
Feb 16 13:24:16 compute-0 systemd-machined[155229]: Machine qemu-3-instance-00000005 terminated.
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [NOTICE]   (206996) : haproxy version is 2.8.14-c23fe91
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [NOTICE]   (206996) : path to executable is /usr/sbin/haproxy
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [WARNING]  (206996) : Exiting Master process...
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [WARNING]  (206996) : Exiting Master process...
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [ALERT]    (206996) : Current worker (206998) exited with code 143 (Terminated)
Feb 16 13:24:16 compute-0 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206992]: [WARNING]  (206996) : All workers exited. Exiting... (0)
Feb 16 13:24:16 compute-0 systemd[1]: libpod-ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf.scope: Deactivated successfully.
Feb 16 13:24:16 compute-0 podman[207847]: 2026-02-16 13:24:16.304903145 +0000 UTC m=+0.051552039 container died ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:24:16 compute-0 kernel: tapb14d49f7-53: entered promiscuous mode
Feb 16 13:24:16 compute-0 NetworkManager[56177]: <info>  [1771248256.3261] manager: (tapb14d49f7-53): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 16 13:24:16 compute-0 systemd-udevd[207827]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00059|binding|INFO|Claiming lport b14d49f7-53e0-4c41-a463-8f16b26817ae for this chassis.
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00060|binding|INFO|b14d49f7-53e0-4c41-a463-8f16b26817ae: Claiming fa:16:3e:49:6b:ae 10.100.0.8
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.328 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf-userdata-shm.mount: Deactivated successfully.
Feb 16 13:24:16 compute-0 kernel: tapb14d49f7-53 (unregistering): left promiscuous mode
Feb 16 13:24:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-890aef6c8cdf419a2b6c513b894d36adca80637c7d3be7cad014923b430b09ce-merged.mount: Deactivated successfully.
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00061|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae ovn-installed in OVS
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.340 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 podman[207847]: 2026-02-16 13:24:16.342839122 +0000 UTC m=+0.089488016 container cleanup ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.345 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:6b:ae 10.100.0.8'], port_security=['fa:16:3e:49:6b:ae 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '139d8f81-7f89-4100-af32-e59289aeb6f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b14d49f7-53e0-4c41-a463-8f16b26817ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00062|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae up in Southbound
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00063|binding|INFO|Releasing lport b14d49f7-53e0-4c41-a463-8f16b26817ae from this chassis (sb_readonly=1)
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.347 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00064|binding|INFO|Removing iface tapb14d49f7-53 ovn-installed in OVS
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00065|if_status|INFO|Not setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae down as sb is readonly
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.349 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.352 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00066|binding|INFO|Releasing lport b14d49f7-53e0-4c41-a463-8f16b26817ae from this chassis (sb_readonly=0)
Feb 16 13:24:16 compute-0 ovn_controller[96072]: 2026-02-16T13:24:16Z|00067|binding|INFO|Setting lport b14d49f7-53e0-4c41-a463-8f16b26817ae down in Southbound
Feb 16 13:24:16 compute-0 systemd[1]: libpod-conmon-ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf.scope: Deactivated successfully.
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.376 185727 INFO nova.virt.libvirt.driver [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Instance destroyed successfully.
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.376 185727 DEBUG nova.objects.instance [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid 139d8f81-7f89-4100-af32-e59289aeb6f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.379 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:6b:ae 10.100.0.8'], port_security=['fa:16:3e:49:6b:ae 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '139d8f81-7f89-4100-af32-e59289aeb6f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b14d49f7-53e0-4c41-a463-8f16b26817ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:16 compute-0 podman[207886]: 2026-02-16 13:24:16.406533803 +0000 UTC m=+0.038320708 container remove ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.411 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7e5cdf-21ea-4e24-80c1-22bc409af70f]: (4, ('Mon Feb 16 01:24:16 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf)\nab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf\nMon Feb 16 01:24:16 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (ab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf)\nab76ecbfc722e0ed48e66496cd71d756e9981f8ef0a34e55d14276dbd768f8cf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.413 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bab5eda9-42e5-400a-ab75-0b94703bbd71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.414 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.415 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 kernel: tapa6199784-10: left promiscuous mode
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.423 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.425 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdeae8e-9bb6-4152-b0bd-55fc5aa2f0fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.438 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5df496b0-fd2e-4211-ae59-44f11d02d306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.440 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba6a8bf-a67a-47df-9bfb-24be7afec330]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.441 185727 DEBUG nova.virt.libvirt.vif [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:22:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1427510884',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1427510884',id=5,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:22:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-gqinun03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:22:51Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=139d8f81-7f89-4100-af32-e59289aeb6f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.442 185727 DEBUG nova.network.os_vif_util [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "address": "fa:16:3e:49:6b:ae", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb14d49f7-53", "ovs_interfaceid": "b14d49f7-53e0-4c41-a463-8f16b26817ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.443 185727 DEBUG nova.network.os_vif_util [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.443 185727 DEBUG os_vif [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.445 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.445 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb14d49f7-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.446 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.448 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.450 185727 INFO os_vif [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:6b:ae,bridge_name='br-int',has_traffic_filtering=True,id=b14d49f7-53e0-4c41-a463-8f16b26817ae,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb14d49f7-53')
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.451 185727 INFO nova.virt.libvirt.driver [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Deleting instance files /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5_del
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.451 185727 INFO nova.virt.libvirt.driver [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Deletion of /var/lib/nova/instances/139d8f81-7f89-4100-af32-e59289aeb6f5_del complete
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.453 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e01507-7e80-4a73-bd25-def8ce868415]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418754, 'reachable_time': 43332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207909, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.456 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:24:16 compute-0 systemd[1]: run-netns-ovnmeta\x2da6199784\x2d1742\x2d41a7\x2d9152\x2dbb54abb7bef1.mount: Deactivated successfully.
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.456 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[455a9cf5-61b7-4079-b33d-ca4a55e9e42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.458 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b14d49f7-53e0-4c41-a463-8f16b26817ae in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.459 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6199784-1742-41a7-9152-bb54abb7bef1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.461 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9341ddea-4243-4083-b42c-059851d5daf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.461 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b14d49f7-53e0-4c41-a463-8f16b26817ae in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.463 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6199784-1742-41a7-9152-bb54abb7bef1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:24:16 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:16.463 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[26fba870-f650-4d0a-b94c-665947533c2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.644 185727 INFO nova.compute.manager [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Took 0.54 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.644 185727 DEBUG oslo.service.loopingcall [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.645 185727 DEBUG nova.compute.manager [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.645 185727 DEBUG nova.network.neutron [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.886 185727 DEBUG nova.compute.manager [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.886 185727 DEBUG oslo_concurrency.lockutils [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.887 185727 DEBUG oslo_concurrency.lockutils [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.887 185727 DEBUG oslo_concurrency.lockutils [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.888 185727 DEBUG nova.compute.manager [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:16 compute-0 nova_compute[185723]: 2026-02-16 13:24:16.888 185727 DEBUG nova.compute.manager [req-214af974-a107-4979-9bcf-27211acadd1a req-942ba96e-096b-431f-883f-f43857559dd1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:17 compute-0 nova_compute[185723]: 2026-02-16 13:24:17.178 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:18 compute-0 nova_compute[185723]: 2026-02-16 13:24:18.698 185727 DEBUG nova.network.neutron [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:18 compute-0 nova_compute[185723]: 2026-02-16 13:24:18.738 185727 INFO nova.compute.manager [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Took 2.09 seconds to deallocate network for instance.
Feb 16 13:24:18 compute-0 nova_compute[185723]: 2026-02-16 13:24:18.866 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:18 compute-0 nova_compute[185723]: 2026-02-16 13:24:18.867 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.040 185727 DEBUG nova.compute.provider_tree [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.114 185727 DEBUG nova.compute.manager [req-c08c7a2f-39fa-49ae-8081-a2989205befd req-bb173ffc-b7ff-48ea-9d01-6fb85dc03f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-deleted-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.124 185727 DEBUG nova.scheduler.client.report [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.194 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.212 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.213 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.213 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.214 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.214 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.214 185727 WARNING nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state deleted and task_state None.
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.214 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.215 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.215 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.215 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.215 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.216 185727 WARNING nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state deleted and task_state None.
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.216 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.216 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.216 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.216 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.217 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.217 185727 WARNING nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state deleted and task_state None.
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.217 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.217 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.218 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.218 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.218 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.218 185727 WARNING nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-unplugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state deleted and task_state None.
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.219 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.219 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.219 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.219 185727 DEBUG oslo_concurrency.lockutils [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.220 185727 DEBUG nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] No waiting events found dispatching network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.220 185727 WARNING nova.compute.manager [req-f79e479a-2148-4516-8deb-3de127677ab7 req-90de16c5-25b9-4213-8524-7dbea07675d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Received unexpected event network-vif-plugged-b14d49f7-53e0-4c41-a463-8f16b26817ae for instance with vm_state deleted and task_state None.
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.264 185727 INFO nova.scheduler.client.report [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance 139d8f81-7f89-4100-af32-e59289aeb6f5
Feb 16 13:24:19 compute-0 nova_compute[185723]: 2026-02-16 13:24:19.390 185727 DEBUG oslo_concurrency.lockutils [None req-07f4e300-44d7-44ee-bb67-800503c03163 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "139d8f81-7f89-4100-af32-e59289aeb6f5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:21 compute-0 nova_compute[185723]: 2026-02-16 13:24:21.448 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:21 compute-0 sshd-session[207912]: Connection closed by authenticating user root 64.227.72.94 port 36530 [preauth]
Feb 16 13:24:22 compute-0 nova_compute[185723]: 2026-02-16 13:24:22.179 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:22 compute-0 sshd-session[207910]: Connection closed by authenticating user root 146.190.22.227 port 51190 [preauth]
Feb 16 13:24:25 compute-0 nova_compute[185723]: 2026-02-16 13:24:25.812 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248250.8106565, 393399a7-477d-4663-9a81-2b968a02bb03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:25 compute-0 nova_compute[185723]: 2026-02-16 13:24:25.813 185727 INFO nova.compute.manager [-] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] VM Stopped (Lifecycle Event)
Feb 16 13:24:25 compute-0 nova_compute[185723]: 2026-02-16 13:24:25.853 185727 DEBUG nova.compute.manager [None req-77d0a02b-6370-4419-b3c9-14cdfba0fbf5 - - - - - -] [instance: 393399a7-477d-4663-9a81-2b968a02bb03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:26 compute-0 podman[207917]: 2026-02-16 13:24:26.020519514 +0000 UTC m=+0.055434335 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:24:26 compute-0 podman[207916]: 2026-02-16 13:24:26.026315069 +0000 UTC m=+0.061731483 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, build-date=2026-02-05T04:57:10Z, distribution-scope=public)
Feb 16 13:24:26 compute-0 nova_compute[185723]: 2026-02-16 13:24:26.449 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:27 compute-0 nova_compute[185723]: 2026-02-16 13:24:27.182 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-0 podman[195053]: time="2026-02-16T13:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:24:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:24:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:24:31 compute-0 podman[207956]: 2026-02-16 13:24:31.08161268 +0000 UTC m=+0.126932472 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:24:31 compute-0 nova_compute[185723]: 2026-02-16 13:24:31.374 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248256.3734252, 139d8f81-7f89-4100-af32-e59289aeb6f5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:31 compute-0 nova_compute[185723]: 2026-02-16 13:24:31.375 185727 INFO nova.compute.manager [-] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] VM Stopped (Lifecycle Event)
Feb 16 13:24:31 compute-0 nova_compute[185723]: 2026-02-16 13:24:31.403 185727 DEBUG nova.compute.manager [None req-04b8942b-dca1-4c78-b209-1a8d69145a81 - - - - - -] [instance: 139d8f81-7f89-4100-af32-e59289aeb6f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:31 compute-0 openstack_network_exporter[197909]: ERROR   13:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:24:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:24:31 compute-0 openstack_network_exporter[197909]: ERROR   13:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:24:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:24:31 compute-0 nova_compute[185723]: 2026-02-16 13:24:31.452 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-0 nova_compute[185723]: 2026-02-16 13:24:32.184 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:36 compute-0 nova_compute[185723]: 2026-02-16 13:24:36.453 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:37 compute-0 nova_compute[185723]: 2026-02-16 13:24:37.186 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:41 compute-0 podman[207983]: 2026-02-16 13:24:41.035385596 +0000 UTC m=+0.075192029 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:24:41 compute-0 nova_compute[185723]: 2026-02-16 13:24:41.455 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:42 compute-0 nova_compute[185723]: 2026-02-16 13:24:42.187 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:46 compute-0 nova_compute[185723]: 2026-02-16 13:24:46.457 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:47 compute-0 nova_compute[185723]: 2026-02-16 13:24:47.188 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:48 compute-0 nova_compute[185723]: 2026-02-16 13:24:48.660 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:51 compute-0 nova_compute[185723]: 2026-02-16 13:24:51.458 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:52 compute-0 nova_compute[185723]: 2026-02-16 13:24:52.248 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:56 compute-0 nova_compute[185723]: 2026-02-16 13:24:56.459 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:57 compute-0 podman[208009]: 2026-02-16 13:24:57.011924669 +0000 UTC m=+0.044452042 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:24:57 compute-0 podman[208008]: 2026-02-16 13:24:57.012160895 +0000 UTC m=+0.050007020 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, architecture=x86_64, vcs-type=git, version=9.7)
Feb 16 13:24:57 compute-0 nova_compute[185723]: 2026-02-16 13:24:57.251 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:58 compute-0 nova_compute[185723]: 2026-02-16 13:24:58.427 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:59.407 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:59 compute-0 nova_compute[185723]: 2026-02-16 13:24:59.407 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:24:59.408 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:24:59 compute-0 podman[195053]: time="2026-02-16T13:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:24:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:24:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 13:25:00 compute-0 sshd-session[208046]: Invalid user user from 188.166.42.159 port 51760
Feb 16 13:25:00 compute-0 sshd-session[208046]: Connection closed by invalid user user 188.166.42.159 port 51760 [preauth]
Feb 16 13:25:01 compute-0 openstack_network_exporter[197909]: ERROR   13:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:25:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:25:01 compute-0 openstack_network_exporter[197909]: ERROR   13:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:25:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:25:01 compute-0 nova_compute[185723]: 2026-02-16 13:25:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:01 compute-0 nova_compute[185723]: 2026-02-16 13:25:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:01 compute-0 nova_compute[185723]: 2026-02-16 13:25:01.461 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:02 compute-0 podman[208048]: 2026-02-16 13:25:02.039124029 +0000 UTC m=+0.082770618 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:25:02 compute-0 nova_compute[185723]: 2026-02-16 13:25:02.300 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:25:03.217 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:25:03.217 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:25:03.217 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:25:03.410 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.473 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.474 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.474 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.474 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.644 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.646 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5845MB free_disk=73.22747802734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.646 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.646 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.746 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.746 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.772 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.796 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.858 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:25:04 compute-0 nova_compute[185723]: 2026-02-16 13:25:04.859 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.858 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.859 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.859 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.902 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.903 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:05 compute-0 nova_compute[185723]: 2026-02-16 13:25:05.904 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:06 compute-0 nova_compute[185723]: 2026-02-16 13:25:06.462 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:06 compute-0 nova_compute[185723]: 2026-02-16 13:25:06.473 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:07 compute-0 nova_compute[185723]: 2026-02-16 13:25:07.304 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:07 compute-0 nova_compute[185723]: 2026-02-16 13:25:07.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:07 compute-0 nova_compute[185723]: 2026-02-16 13:25:07.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:25:08 compute-0 sshd-session[208075]: Connection closed by authenticating user root 146.190.226.24 port 51234 [preauth]
Feb 16 13:25:11 compute-0 sshd-session[208077]: Connection closed by authenticating user root 64.227.72.94 port 39146 [preauth]
Feb 16 13:25:11 compute-0 nova_compute[185723]: 2026-02-16 13:25:11.465 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:12 compute-0 podman[208079]: 2026-02-16 13:25:12.020184857 +0000 UTC m=+0.054405390 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:25:12 compute-0 nova_compute[185723]: 2026-02-16 13:25:12.337 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:16 compute-0 nova_compute[185723]: 2026-02-16 13:25:16.466 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:17 compute-0 nova_compute[185723]: 2026-02-16 13:25:17.338 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:21 compute-0 nova_compute[185723]: 2026-02-16 13:25:21.467 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:22 compute-0 nova_compute[185723]: 2026-02-16 13:25:22.339 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:26 compute-0 nova_compute[185723]: 2026-02-16 13:25:26.468 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:27 compute-0 nova_compute[185723]: 2026-02-16 13:25:27.341 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:28 compute-0 podman[208106]: 2026-02-16 13:25:28.026320219 +0000 UTC m=+0.063011035 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:25:28 compute-0 podman[208105]: 2026-02-16 13:25:28.031224361 +0000 UTC m=+0.068294056 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, release=1770267347, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:25:29 compute-0 podman[195053]: time="2026-02-16T13:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:25:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:25:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:25:30 compute-0 ovn_controller[96072]: 2026-02-16T13:25:30Z|00068|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:25:31 compute-0 openstack_network_exporter[197909]: ERROR   13:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:25:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:25:31 compute-0 openstack_network_exporter[197909]: ERROR   13:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:25:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:25:31 compute-0 nova_compute[185723]: 2026-02-16 13:25:31.471 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:32 compute-0 nova_compute[185723]: 2026-02-16 13:25:32.343 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:33 compute-0 podman[208145]: 2026-02-16 13:25:33.059796303 +0000 UTC m=+0.097735691 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:25:36 compute-0 nova_compute[185723]: 2026-02-16 13:25:36.472 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:37 compute-0 nova_compute[185723]: 2026-02-16 13:25:37.363 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:41 compute-0 nova_compute[185723]: 2026-02-16 13:25:41.474 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:42 compute-0 nova_compute[185723]: 2026-02-16 13:25:42.407 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:43 compute-0 podman[208172]: 2026-02-16 13:25:43.002093059 +0000 UTC m=+0.047963893 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:25:46 compute-0 nova_compute[185723]: 2026-02-16 13:25:46.476 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:47 compute-0 nova_compute[185723]: 2026-02-16 13:25:47.453 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:51 compute-0 nova_compute[185723]: 2026-02-16 13:25:51.477 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:52 compute-0 nova_compute[185723]: 2026-02-16 13:25:52.483 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-0 nova_compute[185723]: 2026-02-16 13:25:56.479 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-0 sshd-session[208196]: Invalid user vps from 188.166.42.159 port 59938
Feb 16 13:25:57 compute-0 sshd-session[208196]: Connection closed by invalid user vps 188.166.42.159 port 59938 [preauth]
Feb 16 13:25:57 compute-0 nova_compute[185723]: 2026-02-16 13:25:57.520 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:59 compute-0 podman[208200]: 2026-02-16 13:25:59.058419557 +0000 UTC m=+0.091121415 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Feb 16 13:25:59 compute-0 podman[208201]: 2026-02-16 13:25:59.063609857 +0000 UTC m=+0.088316514 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 13:25:59 compute-0 sshd-session[208198]: Connection closed by authenticating user root 146.190.22.227 port 51156 [preauth]
Feb 16 13:25:59 compute-0 podman[195053]: time="2026-02-16T13:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:25:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:25:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 16 13:26:01 compute-0 openstack_network_exporter[197909]: ERROR   13:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:26:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:26:01 compute-0 openstack_network_exporter[197909]: ERROR   13:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:26:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:26:01 compute-0 nova_compute[185723]: 2026-02-16 13:26:01.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:01 compute-0 nova_compute[185723]: 2026-02-16 13:26:01.481 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:01 compute-0 sshd-session[208242]: Connection closed by authenticating user root 64.227.72.94 port 53958 [preauth]
Feb 16 13:26:02 compute-0 nova_compute[185723]: 2026-02-16 13:26:02.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:02 compute-0 nova_compute[185723]: 2026-02-16 13:26:02.557 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:03.218 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:03.218 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:03.219 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:04 compute-0 podman[208244]: 2026-02-16 13:26:04.068058429 +0000 UTC m=+0.107615959 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.452 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.452 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:05 compute-0 nova_compute[185723]: 2026-02-16 13:26:05.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.459 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.459 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.459 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.484 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.595 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.596 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.22755432128906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.596 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.596 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.673 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.673 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.696 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.727 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.728 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.748 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.775 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.803 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.830 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.833 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:26:06 compute-0 nova_compute[185723]: 2026-02-16 13:26:06.833 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:07 compute-0 nova_compute[185723]: 2026-02-16 13:26:07.611 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:07 compute-0 nova_compute[185723]: 2026-02-16 13:26:07.829 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.676 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.677 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.699 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.785 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.785 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.791 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:26:08 compute-0 nova_compute[185723]: 2026-02-16 13:26:08.792 185727 INFO nova.compute.claims [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.158 185727 DEBUG nova.compute.provider_tree [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.172 185727 DEBUG nova.scheduler.client.report [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.195 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.196 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.260 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.261 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.286 185727 INFO nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.320 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.416 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.417 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.417 185727 INFO nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating image(s)
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.418 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.418 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.418 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.432 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.433 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.456 185727 DEBUG nova.policy [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '566db36bffff4193a494fef52f968126', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67efa696c46c451ba23d1157e0816503', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.480 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.481 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.482 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.492 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.535 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.536 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.561 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.562 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.562 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.606 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.607 185727 DEBUG nova.virt.disk.api [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Checking if we can resize image /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.608 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.673 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.674 185727 DEBUG nova.virt.disk.api [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Cannot resize image /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.675 185727 DEBUG nova.objects.instance [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'migration_context' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.692 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.693 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Ensure instance console log exists: /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.693 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.693 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:09 compute-0 nova_compute[185723]: 2026-02-16 13:26:09.694 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:10 compute-0 nova_compute[185723]: 2026-02-16 13:26:10.439 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:10.440 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:26:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:10.441 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:26:10 compute-0 nova_compute[185723]: 2026-02-16 13:26:10.527 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Successfully created port: 68d12bd9-0c21-41b6-b775-1de285c4be2c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.485 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.770 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Successfully updated port: 68d12bd9-0c21-41b6-b775-1de285c4be2c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.787 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.787 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.788 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.869 185727 DEBUG nova.compute.manager [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-changed-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.870 185727 DEBUG nova.compute.manager [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Refreshing instance network info cache due to event network-changed-68d12bd9-0c21-41b6-b775-1de285c4be2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:26:11 compute-0 nova_compute[185723]: 2026-02-16 13:26:11.870 185727 DEBUG oslo_concurrency.lockutils [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:26:12 compute-0 nova_compute[185723]: 2026-02-16 13:26:12.480 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:26:12 compute-0 sshd-session[208287]: Connection closed by authenticating user root 146.190.226.24 port 52814 [preauth]
Feb 16 13:26:12 compute-0 nova_compute[185723]: 2026-02-16 13:26:12.625 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:13 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:13.443 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.811 185727 DEBUG nova.network.neutron [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.837 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.837 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance network_info: |[{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.838 185727 DEBUG oslo_concurrency.lockutils [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.838 185727 DEBUG nova.network.neutron [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Refreshing network info cache for port 68d12bd9-0c21-41b6-b775-1de285c4be2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.841 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Start _get_guest_xml network_info=[{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.846 185727 WARNING nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.853 185727 DEBUG nova.virt.libvirt.host [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.854 185727 DEBUG nova.virt.libvirt.host [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.863 185727 DEBUG nova.virt.libvirt.host [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.864 185727 DEBUG nova.virt.libvirt.host [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.866 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.866 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.866 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.866 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.867 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.867 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.867 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.867 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.867 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.868 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.868 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.868 185727 DEBUG nova.virt.hardware [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.871 185727 DEBUG nova.virt.libvirt.vif [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:26:09Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.872 185727 DEBUG nova.network.os_vif_util [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.873 185727 DEBUG nova.network.os_vif_util [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.873 185727 DEBUG nova.objects.instance [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.894 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <uuid>c6353280-0641-466d-9963-30eb530755e9</uuid>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <name>instance-00000008</name>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteBasicStrategy-server-92456537</nova:name>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:26:13</nova:creationTime>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:user uuid="566db36bffff4193a494fef52f968126">tempest-TestExecuteBasicStrategy-2074109192-project-member</nova:user>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:project uuid="67efa696c46c451ba23d1157e0816503">tempest-TestExecuteBasicStrategy-2074109192</nova:project>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         <nova:port uuid="68d12bd9-0c21-41b6-b775-1de285c4be2c">
Feb 16 13:26:13 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <system>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="serial">c6353280-0641-466d-9963-30eb530755e9</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="uuid">c6353280-0641-466d-9963-30eb530755e9</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </system>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <os>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </os>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <features>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </features>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:44:c3:87"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <target dev="tap68d12bd9-0c"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/console.log" append="off"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <video>
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </video>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:26:13 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:26:13 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:26:13 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:26:13 compute-0 nova_compute[185723]: </domain>
Feb 16 13:26:13 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.895 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Preparing to wait for external event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.895 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.896 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.896 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.897 185727 DEBUG nova.virt.libvirt.vif [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:26:09Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.897 185727 DEBUG nova.network.os_vif_util [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.897 185727 DEBUG nova.network.os_vif_util [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.898 185727 DEBUG os_vif [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.899 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.900 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.900 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.903 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.904 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68d12bd9-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.904 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68d12bd9-0c, col_values=(('external_ids', {'iface-id': '68d12bd9-0c21-41b6-b775-1de285c4be2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:c3:87', 'vm-uuid': 'c6353280-0641-466d-9963-30eb530755e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:13 compute-0 NetworkManager[56177]: <info>  [1771248373.9067] manager: (tap68d12bd9-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.907 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.911 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.913 185727 INFO os_vif [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c')
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.964 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.965 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.966 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No VIF found with MAC fa:16:3e:44:c3:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:26:13 compute-0 nova_compute[185723]: 2026-02-16 13:26:13.966 185727 INFO nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Using config drive
Feb 16 13:26:14 compute-0 podman[208291]: 2026-02-16 13:26:14.04615326 +0000 UTC m=+0.086284524 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.600 185727 INFO nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating config drive at /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.604 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzzvf7fyz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.730 185727 DEBUG oslo_concurrency.processutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzzvf7fyz" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:14 compute-0 kernel: tap68d12bd9-0c: entered promiscuous mode
Feb 16 13:26:14 compute-0 NetworkManager[56177]: <info>  [1771248374.7869] manager: (tap68d12bd9-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Feb 16 13:26:14 compute-0 systemd-udevd[208330]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:26:14 compute-0 ovn_controller[96072]: 2026-02-16T13:26:14Z|00069|binding|INFO|Claiming lport 68d12bd9-0c21-41b6-b775-1de285c4be2c for this chassis.
Feb 16 13:26:14 compute-0 ovn_controller[96072]: 2026-02-16T13:26:14Z|00070|binding|INFO|68d12bd9-0c21-41b6-b775-1de285c4be2c: Claiming fa:16:3e:44:c3:87 10.100.0.5
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.834 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.839 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.849 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:c3:87 10.100.0.5'], port_security=['fa:16:3e:44:c3:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6353280-0641-466d-9963-30eb530755e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=68d12bd9-0c21-41b6-b775-1de285c4be2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:26:14 compute-0 NetworkManager[56177]: <info>  [1771248374.8530] device (tap68d12bd9-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:26:14 compute-0 NetworkManager[56177]: <info>  [1771248374.8542] device (tap68d12bd9-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.851 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 68d12bd9-0c21-41b6-b775-1de285c4be2c in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc bound to our chassis
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.852 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:26:14 compute-0 systemd-machined[155229]: New machine qemu-5-instance-00000008.
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.865 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4eff4097-a081-4ce9-9086-e830325bec72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.866 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85f5abea-a1 in ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.868 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85f5abea-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.868 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[08cc1632-dd59-41fe-a28e-73ebc4558c8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.869 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[39c71b3d-4d1e-4fb3-ba61-292f9c54f5d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:14 compute-0 ovn_controller[96072]: 2026-02-16T13:26:14Z|00071|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c ovn-installed in OVS
Feb 16 13:26:14 compute-0 ovn_controller[96072]: 2026-02-16T13:26:14Z|00072|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c up in Southbound
Feb 16 13:26:14 compute-0 nova_compute[185723]: 2026-02-16 13:26:14.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:14 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.883 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[76d18f8d-306b-43d6-8cc8-ec670fa98993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:14 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:14.971 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bd69a8e1-02ff-4496-b1bd-b8597f172114]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.001 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3d47df1e-4a95-4967-92d3-6ab9640f2dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.005 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2b998d1d-9e22-4838-8c8d-e5c1eb77228a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 NetworkManager[56177]: <info>  [1771248375.0062] manager: (tap85f5abea-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.026 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[75cdb982-b6b1-4cda-af60-dad2f3b13760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.029 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[6314063a-c179-4249-8c9a-69805327f053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 NetworkManager[56177]: <info>  [1771248375.0429] device (tap85f5abea-a0): carrier: link connected
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.046 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[80a7c356-5381-41d5-80de-aa5fdc50a7d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.059 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4ac04d-816c-499e-be85-6bbab476bad9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445843, 'reachable_time': 31899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208366, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.070 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9202d900-95a0-4a6e-8896-76f9538256bb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:9c74'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445843, 'tstamp': 445843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208368, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.072 185727 DEBUG nova.compute.manager [req-4a71210d-8bc3-4e80-8e6f-89540207d97b req-f565d5d0-88f6-4744-aa0c-0b54f6dd4f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.072 185727 DEBUG oslo_concurrency.lockutils [req-4a71210d-8bc3-4e80-8e6f-89540207d97b req-f565d5d0-88f6-4744-aa0c-0b54f6dd4f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.073 185727 DEBUG oslo_concurrency.lockutils [req-4a71210d-8bc3-4e80-8e6f-89540207d97b req-f565d5d0-88f6-4744-aa0c-0b54f6dd4f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.073 185727 DEBUG oslo_concurrency.lockutils [req-4a71210d-8bc3-4e80-8e6f-89540207d97b req-f565d5d0-88f6-4744-aa0c-0b54f6dd4f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.073 185727 DEBUG nova.compute.manager [req-4a71210d-8bc3-4e80-8e6f-89540207d97b req-f565d5d0-88f6-4744-aa0c-0b54f6dd4f8b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Processing event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.081 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b294b351-67c2-42df-9cf5-60e51d65625e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445843, 'reachable_time': 31899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208369, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.098 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb7f3d9-913f-4e43-b577-6d0597807be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.131 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ed259aad-5a7f-42f3-993c-25d614d2d417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.132 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.132 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.133 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f5abea-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.135 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:15 compute-0 NetworkManager[56177]: <info>  [1771248375.1358] manager: (tap85f5abea-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb 16 13:26:15 compute-0 kernel: tap85f5abea-a0: entered promiscuous mode
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.139 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.140 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85f5abea-a0, col_values=(('external_ids', {'iface-id': 'd3feb028-a3ab-4f43-8a4b-1ee3054fd9f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.142 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:15 compute-0 ovn_controller[96072]: 2026-02-16T13:26:15Z|00073|binding|INFO|Releasing lport d3feb028-a3ab-4f43-8a4b-1ee3054fd9f1 from this chassis (sb_readonly=0)
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.143 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.144 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.144 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[05939e71-79e4-424e-bf58-7e6052cec61a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.145 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:26:15 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:26:15.145 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'env', 'PROCESS_TAG=haproxy-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85f5abea-ac25-4244-a69b-79e29b2ba1fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.146 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.339 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.339 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248375.3384771, c6353280-0641-466d-9963-30eb530755e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.340 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Started (Lifecycle Event)
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.342 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.345 185727 INFO nova.virt.libvirt.driver [-] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance spawned successfully.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.345 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.370 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.374 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.375 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.375 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.376 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.376 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.376 185727 DEBUG nova.virt.libvirt.driver [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.383 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.420 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.421 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248375.338718, c6353280-0641-466d-9963-30eb530755e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.421 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Paused (Lifecycle Event)
Feb 16 13:26:15 compute-0 podman[208407]: 2026-02-16 13:26:15.429201025 +0000 UTC m=+0.046239280 container create 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.447 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.450 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248375.3415887, c6353280-0641-466d-9963-30eb530755e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.450 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Resumed (Lifecycle Event)
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.453 185727 INFO nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Took 6.04 seconds to spawn the instance on the hypervisor.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.453 185727 DEBUG nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:26:15 compute-0 systemd[1]: Started libpod-conmon-158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00.scope.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.483 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.485 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:26:15 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:26:15 compute-0 podman[208407]: 2026-02-16 13:26:15.401258385 +0000 UTC m=+0.018296660 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:26:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/282335b6f3b2ad70372fbfcb18a63f51f22b6c54176cbebabd3f0920c11b8c0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:26:15 compute-0 podman[208407]: 2026-02-16 13:26:15.508518453 +0000 UTC m=+0.125556728 container init 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:26:15 compute-0 podman[208407]: 2026-02-16 13:26:15.512273107 +0000 UTC m=+0.129311362 container start 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:26:15 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [NOTICE]   (208426) : New worker (208428) forked
Feb 16 13:26:15 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [NOTICE]   (208426) : Loading success.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.539 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.554 185727 INFO nova.compute.manager [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Took 6.79 seconds to build instance.
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.579 185727 DEBUG oslo_concurrency.lockutils [None req-967194eb-f58b-427a-bf7c-b57358159c8c 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.903s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.934 185727 DEBUG nova.network.neutron [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updated VIF entry in instance network info cache for port 68d12bd9-0c21-41b6-b775-1de285c4be2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.934 185727 DEBUG nova.network.neutron [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:26:15 compute-0 nova_compute[185723]: 2026-02-16 13:26:15.951 185727 DEBUG oslo_concurrency.lockutils [req-048bc99d-c423-4834-9f66-807b50299d73 req-b004c738-c0a4-4a6e-8b65-dac01e6419be faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.180 185727 DEBUG nova.compute.manager [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.180 185727 DEBUG oslo_concurrency.lockutils [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.181 185727 DEBUG oslo_concurrency.lockutils [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.181 185727 DEBUG oslo_concurrency.lockutils [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.182 185727 DEBUG nova.compute.manager [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.182 185727 WARNING nova.compute.manager [req-4fa493cf-6c06-4574-98f7-f12d9e898281 req-34c62881-674d-4bc0-9840-e526561d42e8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state active and task_state None.
Feb 16 13:26:17 compute-0 nova_compute[185723]: 2026-02-16 13:26:17.627 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:18 compute-0 nova_compute[185723]: 2026-02-16 13:26:18.906 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:22 compute-0 nova_compute[185723]: 2026-02-16 13:26:22.629 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:23 compute-0 nova_compute[185723]: 2026-02-16 13:26:23.908 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:27 compute-0 nova_compute[185723]: 2026-02-16 13:26:27.631 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:28 compute-0 ovn_controller[96072]: 2026-02-16T13:26:28Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:c3:87 10.100.0.5
Feb 16 13:26:28 compute-0 ovn_controller[96072]: 2026-02-16T13:26:28Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:c3:87 10.100.0.5
Feb 16 13:26:28 compute-0 nova_compute[185723]: 2026-02-16 13:26:28.911 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:29 compute-0 podman[195053]: time="2026-02-16T13:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:26:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:26:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 13:26:30 compute-0 podman[208454]: 2026-02-16 13:26:30.011100109 +0000 UTC m=+0.046129447 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:26:30 compute-0 podman[208453]: 2026-02-16 13:26:30.041225764 +0000 UTC m=+0.075677068 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z)
Feb 16 13:26:31 compute-0 openstack_network_exporter[197909]: ERROR   13:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:26:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:26:31 compute-0 openstack_network_exporter[197909]: ERROR   13:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:26:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:26:32 compute-0 nova_compute[185723]: 2026-02-16 13:26:32.633 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:33 compute-0 nova_compute[185723]: 2026-02-16 13:26:33.913 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:35 compute-0 podman[208492]: 2026-02-16 13:26:35.022706792 +0000 UTC m=+0.062057626 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:26:37 compute-0 nova_compute[185723]: 2026-02-16 13:26:37.636 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:38 compute-0 nova_compute[185723]: 2026-02-16 13:26:38.915 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:42 compute-0 nova_compute[185723]: 2026-02-16 13:26:42.637 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:43 compute-0 nova_compute[185723]: 2026-02-16 13:26:43.917 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:45 compute-0 podman[208518]: 2026-02-16 13:26:45.0110025 +0000 UTC m=+0.044854186 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:26:45 compute-0 ovn_controller[96072]: 2026-02-16T13:26:45Z|00074|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Feb 16 13:26:47 compute-0 nova_compute[185723]: 2026-02-16 13:26:47.642 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:48 compute-0 nova_compute[185723]: 2026-02-16 13:26:48.919 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:52 compute-0 sshd-session[208542]: Invalid user testuser from 188.166.42.159 port 57512
Feb 16 13:26:52 compute-0 sshd-session[208544]: Connection closed by authenticating user root 64.227.72.94 port 59976 [preauth]
Feb 16 13:26:52 compute-0 sshd-session[208542]: Connection closed by invalid user testuser 188.166.42.159 port 57512 [preauth]
Feb 16 13:26:52 compute-0 nova_compute[185723]: 2026-02-16 13:26:52.644 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:53 compute-0 nova_compute[185723]: 2026-02-16 13:26:53.920 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:57 compute-0 nova_compute[185723]: 2026-02-16 13:26:57.645 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:58 compute-0 nova_compute[185723]: 2026-02-16 13:26:58.922 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:59 compute-0 podman[195053]: time="2026-02-16T13:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:26:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:26:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 16 13:27:01 compute-0 podman[208546]: 2026-02-16 13:27:01.023751553 +0000 UTC m=+0.064492977 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, architecture=x86_64, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Feb 16 13:27:01 compute-0 podman[208547]: 2026-02-16 13:27:01.031084907 +0000 UTC m=+0.063479482 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:27:01 compute-0 openstack_network_exporter[197909]: ERROR   13:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:27:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:27:01 compute-0 openstack_network_exporter[197909]: ERROR   13:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:27:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:27:02 compute-0 nova_compute[185723]: 2026-02-16 13:27:02.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:02 compute-0 nova_compute[185723]: 2026-02-16 13:27:02.647 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:03.219 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:03.220 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:03.220 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:03 compute-0 nova_compute[185723]: 2026-02-16 13:27:03.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:04 compute-0 nova_compute[185723]: 2026-02-16 13:27:04.029 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:04 compute-0 nova_compute[185723]: 2026-02-16 13:27:04.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:06 compute-0 podman[208584]: 2026-02-16 13:27:06.02320551 +0000 UTC m=+0.067583615 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:27:06 compute-0 nova_compute[185723]: 2026-02-16 13:27:06.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:06 compute-0 nova_compute[185723]: 2026-02-16 13:27:06.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:27:06 compute-0 nova_compute[185723]: 2026-02-16 13:27:06.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:27:07 compute-0 nova_compute[185723]: 2026-02-16 13:27:07.190 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:27:07 compute-0 nova_compute[185723]: 2026-02-16 13:27:07.191 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:27:07 compute-0 nova_compute[185723]: 2026-02-16 13:27:07.192 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:27:07 compute-0 nova_compute[185723]: 2026-02-16 13:27:07.192 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:27:07 compute-0 nova_compute[185723]: 2026-02-16 13:27:07.688 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.086 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.712 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.736 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.736 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.737 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.738 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.738 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.738 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.792 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.793 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.793 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.793 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.886 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.964 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:09 compute-0 nova_compute[185723]: 2026-02-16 13:27:09.966 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.020 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.144 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.145 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.19775772094727GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.146 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.146 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.234 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance c6353280-0641-466d-9963-30eb530755e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.234 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.234 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.328 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.350 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.372 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:27:10 compute-0 nova_compute[185723]: 2026-02-16 13:27:10.372 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:11 compute-0 nova_compute[185723]: 2026-02-16 13:27:11.067 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:11 compute-0 nova_compute[185723]: 2026-02-16 13:27:11.067 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:11 compute-0 nova_compute[185723]: 2026-02-16 13:27:11.068 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:27:12 compute-0 nova_compute[185723]: 2026-02-16 13:27:12.690 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:14 compute-0 nova_compute[185723]: 2026-02-16 13:27:14.132 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:16 compute-0 podman[208617]: 2026-02-16 13:27:16.020809463 +0000 UTC m=+0.057260426 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:27:17 compute-0 nova_compute[185723]: 2026-02-16 13:27:17.694 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:18 compute-0 sshd-session[208641]: Connection closed by authenticating user root 146.190.226.24 port 49484 [preauth]
Feb 16 13:27:19 compute-0 nova_compute[185723]: 2026-02-16 13:27:19.136 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:22 compute-0 nova_compute[185723]: 2026-02-16 13:27:22.696 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:24 compute-0 nova_compute[185723]: 2026-02-16 13:27:24.138 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:26 compute-0 sshd-session[208643]: Connection closed by authenticating user root 146.190.22.227 port 49612 [preauth]
Feb 16 13:27:28 compute-0 nova_compute[185723]: 2026-02-16 13:27:28.096 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:29 compute-0 nova_compute[185723]: 2026-02-16 13:27:29.173 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:29 compute-0 nova_compute[185723]: 2026-02-16 13:27:29.603 185727 DEBUG nova.compute.manager [None req-59c0cdd9-a31f-4441-8004-b4af6b272e39 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:27:29 compute-0 nova_compute[185723]: 2026-02-16 13:27:29.653 185727 DEBUG nova.compute.provider_tree [None req-59c0cdd9-a31f-4441-8004-b4af6b272e39 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 11 to 13 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:27:29 compute-0 podman[195053]: time="2026-02-16T13:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:27:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:27:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Feb 16 13:27:31 compute-0 openstack_network_exporter[197909]: ERROR   13:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:27:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:27:31 compute-0 openstack_network_exporter[197909]: ERROR   13:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:27:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:27:32 compute-0 podman[208646]: 2026-02-16 13:27:32.043524339 +0000 UTC m=+0.064990520 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:27:32 compute-0 podman[208645]: 2026-02-16 13:27:32.080559907 +0000 UTC m=+0.103923376 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:27:33 compute-0 nova_compute[185723]: 2026-02-16 13:27:33.098 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:34 compute-0 nova_compute[185723]: 2026-02-16 13:27:34.175 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:37 compute-0 podman[208686]: 2026-02-16 13:27:37.043589229 +0000 UTC m=+0.079636027 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 16 13:27:37 compute-0 nova_compute[185723]: 2026-02-16 13:27:37.530 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Check if temp file /var/lib/nova/instances/tmppe2llmuf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:27:37 compute-0 nova_compute[185723]: 2026-02-16 13:27:37.530 185727 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:27:38 compute-0 nova_compute[185723]: 2026-02-16 13:27:38.100 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:39 compute-0 nova_compute[185723]: 2026-02-16 13:27:39.178 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:39 compute-0 nova_compute[185723]: 2026-02-16 13:27:39.939 185727 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:39 compute-0 nova_compute[185723]: 2026-02-16 13:27:39.986 185727 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:39 compute-0 nova_compute[185723]: 2026-02-16 13:27:39.987 185727 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:40 compute-0 nova_compute[185723]: 2026-02-16 13:27:40.035 185727 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:42 compute-0 sshd-session[208718]: Connection closed by authenticating user root 64.227.72.94 port 49138 [preauth]
Feb 16 13:27:43 compute-0 nova_compute[185723]: 2026-02-16 13:27:43.101 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:44 compute-0 nova_compute[185723]: 2026-02-16 13:27:44.180 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:46 compute-0 sshd-session[208720]: Invalid user server from 188.166.42.159 port 38972
Feb 16 13:27:46 compute-0 podman[208722]: 2026-02-16 13:27:46.266701973 +0000 UTC m=+0.088202932 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:27:46 compute-0 sshd-session[208720]: Connection closed by invalid user server 188.166.42.159 port 38972 [preauth]
Feb 16 13:27:48 compute-0 nova_compute[185723]: 2026-02-16 13:27:48.103 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-0 sshd-session[208748]: Accepted publickey for nova from 192.168.122.101 port 37292 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:27:48 compute-0 systemd-logind[818]: New session 29 of user nova.
Feb 16 13:27:48 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:27:48 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:27:48 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:27:48 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:27:48 compute-0 systemd[208752]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:27:48 compute-0 systemd[208752]: Queued start job for default target Main User Target.
Feb 16 13:27:48 compute-0 systemd[208752]: Created slice User Application Slice.
Feb 16 13:27:48 compute-0 systemd[208752]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:27:48 compute-0 systemd[208752]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:27:48 compute-0 systemd[208752]: Reached target Paths.
Feb 16 13:27:48 compute-0 systemd[208752]: Reached target Timers.
Feb 16 13:27:48 compute-0 systemd[208752]: Starting D-Bus User Message Bus Socket...
Feb 16 13:27:48 compute-0 systemd[208752]: Starting Create User's Volatile Files and Directories...
Feb 16 13:27:48 compute-0 systemd[208752]: Finished Create User's Volatile Files and Directories.
Feb 16 13:27:48 compute-0 systemd[208752]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:27:48 compute-0 systemd[208752]: Reached target Sockets.
Feb 16 13:27:48 compute-0 systemd[208752]: Reached target Basic System.
Feb 16 13:27:48 compute-0 systemd[208752]: Reached target Main User Target.
Feb 16 13:27:48 compute-0 systemd[208752]: Startup finished in 121ms.
Feb 16 13:27:48 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:27:48 compute-0 systemd[1]: Started Session 29 of User nova.
Feb 16 13:27:48 compute-0 sshd-session[208748]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:27:48 compute-0 sshd-session[208767]: Received disconnect from 192.168.122.101 port 37292:11: disconnected by user
Feb 16 13:27:48 compute-0 sshd-session[208767]: Disconnected from user nova 192.168.122.101 port 37292
Feb 16 13:27:48 compute-0 sshd-session[208748]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:27:48 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Feb 16 13:27:48 compute-0 systemd-logind[818]: Session 29 logged out. Waiting for processes to exit.
Feb 16 13:27:48 compute-0 systemd-logind[818]: Removed session 29.
Feb 16 13:27:49 compute-0 nova_compute[185723]: 2026-02-16 13:27:49.183 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:50.242 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:27:50 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:50.243 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.243 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.354 185727 DEBUG nova.compute.manager [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.355 185727 DEBUG oslo_concurrency.lockutils [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.355 185727 DEBUG oslo_concurrency.lockutils [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.356 185727 DEBUG oslo_concurrency.lockutils [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.356 185727 DEBUG nova.compute.manager [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.356 185727 DEBUG nova.compute.manager [req-32a8ae08-f120-471a-a81e-bf049d672510 req-087a8720-daff-4a60-ac73-b90d582782de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.990 185727 INFO nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Took 10.95 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 13:27:50 compute-0 nova_compute[185723]: 2026-02-16 13:27:50.991 185727 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.011 185727 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(64356f2a-1c7d-445a-9b5b-217256fe2076),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.035 185727 DEBUG nova.objects.instance [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.036 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.037 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.038 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.055 185727 DEBUG nova.virt.libvirt.vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:26:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:26:15Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.056 185727 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.057 185727 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.057 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:27:51 compute-0 nova_compute[185723]:   <mac address="fa:16:3e:44:c3:87"/>
Feb 16 13:27:51 compute-0 nova_compute[185723]:   <model type="virtio"/>
Feb 16 13:27:51 compute-0 nova_compute[185723]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:27:51 compute-0 nova_compute[185723]:   <mtu size="1442"/>
Feb 16 13:27:51 compute-0 nova_compute[185723]:   <target dev="tap68d12bd9-0c"/>
Feb 16 13:27:51 compute-0 nova_compute[185723]: </interface>
Feb 16 13:27:51 compute-0 nova_compute[185723]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.058 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.540 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.542 185727 INFO nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:27:51 compute-0 nova_compute[185723]: 2026-02-16 13:27:51.641 185727 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.144 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.145 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.523 185727 DEBUG nova.compute.manager [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.524 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.525 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.525 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.525 185727 DEBUG nova.compute.manager [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.526 185727 WARNING nova.compute.manager [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state active and task_state migrating.
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.526 185727 DEBUG nova.compute.manager [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-changed-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.526 185727 DEBUG nova.compute.manager [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Refreshing instance network info cache due to event network-changed-68d12bd9-0c21-41b6-b775-1de285c4be2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.527 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.527 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.527 185727 DEBUG nova.network.neutron [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Refreshing network info cache for port 68d12bd9-0c21-41b6-b775-1de285c4be2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.590 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248472.590053, c6353280-0641-466d-9963-30eb530755e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.591 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Paused (Lifecycle Event)
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.621 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.628 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.650 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.650 185727 DEBUG nova.virt.libvirt.migration [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.660 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:27:52 compute-0 kernel: tap68d12bd9-0c (unregistering): left promiscuous mode
Feb 16 13:27:52 compute-0 NetworkManager[56177]: <info>  [1771248472.7445] device (tap68d12bd9-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:27:52 compute-0 ovn_controller[96072]: 2026-02-16T13:27:52Z|00075|binding|INFO|Releasing lport 68d12bd9-0c21-41b6-b775-1de285c4be2c from this chassis (sb_readonly=0)
Feb 16 13:27:52 compute-0 ovn_controller[96072]: 2026-02-16T13:27:52Z|00076|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c down in Southbound
Feb 16 13:27:52 compute-0 ovn_controller[96072]: 2026-02-16T13:27:52Z|00077|binding|INFO|Removing iface tap68d12bd9-0c ovn-installed in OVS
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.754 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.763 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:c3:87 10.100.0.5'], port_security=['fa:16:3e:44:c3:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '54c1a259-778a-4222-b2c6-8422ea19a065'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6353280-0641-466d-9963-30eb530755e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '8', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=68d12bd9-0c21-41b6-b775-1de285c4be2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.764 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 68d12bd9-0c21-41b6-b775-1de285c4be2c in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc unbound from our chassis
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.765 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.768 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.767 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc0888a-fb63-4b73-a936-a09e1c5410ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.770 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc namespace which is not needed anymore
Feb 16 13:27:52 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 16 13:27:52 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 15.379s CPU time.
Feb 16 13:27:52 compute-0 systemd-machined[155229]: Machine qemu-5-instance-00000008 terminated.
Feb 16 13:27:52 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [NOTICE]   (208426) : haproxy version is 2.8.14-c23fe91
Feb 16 13:27:52 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [NOTICE]   (208426) : path to executable is /usr/sbin/haproxy
Feb 16 13:27:52 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [WARNING]  (208426) : Exiting Master process...
Feb 16 13:27:52 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [ALERT]    (208426) : Current worker (208428) exited with code 143 (Terminated)
Feb 16 13:27:52 compute-0 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208422]: [WARNING]  (208426) : All workers exited. Exiting... (0)
Feb 16 13:27:52 compute-0 systemd[1]: libpod-158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00.scope: Deactivated successfully.
Feb 16 13:27:52 compute-0 podman[208814]: 2026-02-16 13:27:52.891771116 +0000 UTC m=+0.035873347 container died 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:27:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-282335b6f3b2ad70372fbfcb18a63f51f22b6c54176cbebabd3f0920c11b8c0b-merged.mount: Deactivated successfully.
Feb 16 13:27:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00-userdata-shm.mount: Deactivated successfully.
Feb 16 13:27:52 compute-0 podman[208814]: 2026-02-16 13:27:52.916930854 +0000 UTC m=+0.061033085 container cleanup 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:27:52 compute-0 systemd[1]: libpod-conmon-158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00.scope: Deactivated successfully.
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.959 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.959 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.959 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:27:52 compute-0 podman[208842]: 2026-02-16 13:27:52.985836864 +0000 UTC m=+0.052399429 container remove 158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.989 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[31f0556a-e33a-4dee-a342-b98cd86b5076]: (4, ('Mon Feb 16 01:27:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc (158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00)\n158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00\nMon Feb 16 01:27:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc (158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00)\n158b9aa7b75fa9076d9b5fc893d84fd9397b95eaeef8bc46a3650cf48aaffe00\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.990 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2228c136-4797-4416-b90c-afbde7b94775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:52.991 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.993 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:52 compute-0 kernel: tap85f5abea-a0: left promiscuous mode
Feb 16 13:27:52 compute-0 nova_compute[185723]: 2026-02-16 13:27:52.999 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.002 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[badcc22b-fd6a-4a7c-8345-1675551d42fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.017 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a00ed90e-b2d8-44b3-ac91-29bd05819b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.019 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[771c7340-13c9-4776-8352-b723e9d1dcc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.031 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7f85f225-2bc9-49a6-8b68-9786642cb0ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445839, 'reachable_time': 38423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208875, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.035 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:27:53 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:53.035 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[d833b68c-ecd2-4a0e-820e-3eefbd047213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d85f5abea\x2dac25\x2d4244\x2da69b\x2d79e29b2ba1fc.mount: Deactivated successfully.
Feb 16 13:27:53 compute-0 nova_compute[185723]: 2026-02-16 13:27:53.104 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:53 compute-0 nova_compute[185723]: 2026-02-16 13:27:53.153 185727 DEBUG nova.virt.libvirt.guest [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c6353280-0641-466d-9963-30eb530755e9' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:27:53 compute-0 nova_compute[185723]: 2026-02-16 13:27:53.154 185727 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migration operation has completed
Feb 16 13:27:53 compute-0 nova_compute[185723]: 2026-02-16 13:27:53.154 185727 INFO nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] _post_live_migration() is started..
Feb 16 13:27:54 compute-0 nova_compute[185723]: 2026-02-16 13:27:54.580 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:57 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:27:57.246 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.442 185727 DEBUG nova.compute.manager [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.443 185727 DEBUG oslo_concurrency.lockutils [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.443 185727 DEBUG oslo_concurrency.lockutils [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.443 185727 DEBUG oslo_concurrency.lockutils [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.443 185727 DEBUG nova.compute.manager [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.443 185727 DEBUG nova.compute.manager [req-fe9b4ec7-0482-48bf-81f3-0b47f24044d0 req-e7d5f7dc-e644-49dc-80fb-e3ceb6983739 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.752 185727 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 68d12bd9-0c21-41b6-b775-1de285c4be2c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.753 185727 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.754 185727 DEBUG nova.virt.libvirt.vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:26:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:27:31Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.754 185727 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.755 185727 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.755 185727 DEBUG os_vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.758 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.758 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68d12bd9-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.760 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.763 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.766 185727 INFO os_vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c')
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.767 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.767 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.767 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.767 185727 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.768 185727 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Deleting instance files /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9_del
Feb 16 13:27:57 compute-0 nova_compute[185723]: 2026-02-16 13:27:57.768 185727 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Deletion of /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9_del complete
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.103 185727 DEBUG nova.network.neutron [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updated VIF entry in instance network info cache for port 68d12bd9-0c21-41b6-b775-1de285c4be2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.104 185727 DEBUG nova.network.neutron [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.129 185727 DEBUG oslo_concurrency.lockutils [req-3655e2a2-a9a7-4c2a-9be6-bd5242003eb6 req-edbdd4f6-826b-4c80-b0ae-5bb2223d9383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.152 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:27:58 compute-0 nova_compute[185723]: 2026-02-16 13:27:58.455 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:27:58 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:27:58 compute-0 systemd[208752]: Activating special unit Exit the Session...
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped target Main User Target.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped target Basic System.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped target Paths.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped target Sockets.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped target Timers.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:27:58 compute-0 systemd[208752]: Closed D-Bus User Message Bus Socket.
Feb 16 13:27:58 compute-0 systemd[208752]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:27:58 compute-0 systemd[208752]: Removed slice User Application Slice.
Feb 16 13:27:58 compute-0 systemd[208752]: Reached target Shutdown.
Feb 16 13:27:58 compute-0 systemd[208752]: Finished Exit the Session.
Feb 16 13:27:58 compute-0 systemd[208752]: Reached target Exit the Session.
Feb 16 13:27:58 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:27:58 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:27:58 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:27:58 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:27:58 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:27:58 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:27:58 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.549 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.550 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.550 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.550 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.551 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.552 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.552 185727 WARNING nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state active and task_state migrating.
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.552 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.552 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.553 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.554 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.554 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.554 185727 WARNING nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state active and task_state migrating.
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.555 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.555 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.555 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.555 185727 DEBUG oslo_concurrency.lockutils [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.555 185727 DEBUG nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:27:59 compute-0 nova_compute[185723]: 2026-02-16 13:27:59.556 185727 WARNING nova.compute.manager [req-56fa7f5c-8140-4293-8fe2-43c00271ed13 req-9f136ad4-9e80-4b29-b6f1-6b7bed3875de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state active and task_state migrating.
Feb 16 13:27:59 compute-0 podman[195053]: time="2026-02-16T13:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:27:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:27:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 13:28:00 compute-0 nova_compute[185723]: 2026-02-16 13:28:00.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:00 compute-0 nova_compute[185723]: 2026-02-16 13:28:00.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:28:01 compute-0 openstack_network_exporter[197909]: ERROR   13:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:28:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:28:01 compute-0 openstack_network_exporter[197909]: ERROR   13:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:28:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:28:02 compute-0 nova_compute[185723]: 2026-02-16 13:28:02.761 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.007 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.007 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.008 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.034 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.035 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.035 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.036 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:28:03 compute-0 podman[208879]: 2026-02-16 13:28:03.069192149 +0000 UTC m=+0.104980312 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:28:03 compute-0 podman[208880]: 2026-02-16 13:28:03.086104581 +0000 UTC m=+0.115333420 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.154 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.211 185727 WARNING nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.213 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=73.2269515991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.213 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.214 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:28:03.220 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:28:03.221 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:28:03.221 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.279 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance c6353280-0641-466d-9963-30eb530755e9 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.308 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.416 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 64356f2a-1c7d-445a-9b5b-217256fe2076 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.418 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.418 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.462 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.600 185727 DEBUG nova.compute.provider_tree [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.656 185727 DEBUG nova.scheduler.client.report [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.686 185727 DEBUG nova.compute.resource_tracker [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.686 185727 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.692 185727 INFO nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.845 185727 INFO nova.scheduler.client.report [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 64356f2a-1c7d-445a-9b5b-217256fe2076
Feb 16 13:28:03 compute-0 nova_compute[185723]: 2026-02-16 13:28:03.846 185727 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:28:05 compute-0 nova_compute[185723]: 2026-02-16 13:28:05.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:06 compute-0 nova_compute[185723]: 2026-02-16 13:28:06.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:06 compute-0 nova_compute[185723]: 2026-02-16 13:28:06.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:28:06 compute-0 nova_compute[185723]: 2026-02-16 13:28:06.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:28:06 compute-0 nova_compute[185723]: 2026-02-16 13:28:06.450 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:28:07 compute-0 nova_compute[185723]: 2026-02-16 13:28:07.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:07 compute-0 nova_compute[185723]: 2026-02-16 13:28:07.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:07 compute-0 nova_compute[185723]: 2026-02-16 13:28:07.799 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:07 compute-0 nova_compute[185723]: 2026-02-16 13:28:07.958 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248472.9566276, c6353280-0641-466d-9963-30eb530755e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:28:07 compute-0 nova_compute[185723]: 2026-02-16 13:28:07.959 185727 INFO nova.compute.manager [-] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Stopped (Lifecycle Event)
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.003 185727 DEBUG nova.compute.manager [None req-ea0f5867-e1f4-4fdc-80a3-7b94e2116f09 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:28:08 compute-0 podman[208919]: 2026-02-16 13:28:08.079486914 +0000 UTC m=+0.116701714 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.156 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.459 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.459 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.460 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.606 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.607 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.22699737548828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.607 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.607 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.899 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.900 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.926 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.947 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.948 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:28:08 compute-0 nova_compute[185723]: 2026-02-16 13:28:08.949 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:09 compute-0 nova_compute[185723]: 2026-02-16 13:28:09.950 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:12 compute-0 nova_compute[185723]: 2026-02-16 13:28:12.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:12 compute-0 nova_compute[185723]: 2026-02-16 13:28:12.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:28:12 compute-0 nova_compute[185723]: 2026-02-16 13:28:12.802 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:13 compute-0 nova_compute[185723]: 2026-02-16 13:28:13.158 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:17 compute-0 podman[208946]: 2026-02-16 13:28:17.010324061 +0000 UTC m=+0.051585378 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:28:17 compute-0 nova_compute[185723]: 2026-02-16 13:28:17.845 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:18 compute-0 nova_compute[185723]: 2026-02-16 13:28:18.160 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:19 compute-0 nova_compute[185723]: 2026-02-16 13:28:19.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:22 compute-0 nova_compute[185723]: 2026-02-16 13:28:22.848 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:23 compute-0 nova_compute[185723]: 2026-02-16 13:28:23.163 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:24 compute-0 sshd-session[208970]: Invalid user admin from 146.190.226.24 port 39418
Feb 16 13:28:24 compute-0 sshd-session[208970]: Connection closed by invalid user admin 146.190.226.24 port 39418 [preauth]
Feb 16 13:28:27 compute-0 nova_compute[185723]: 2026-02-16 13:28:27.890 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:28 compute-0 nova_compute[185723]: 2026-02-16 13:28:28.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:29 compute-0 podman[195053]: time="2026-02-16T13:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:28:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:28:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:28:31 compute-0 openstack_network_exporter[197909]: ERROR   13:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:28:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:28:31 compute-0 openstack_network_exporter[197909]: ERROR   13:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:28:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:28:32 compute-0 sshd-session[208972]: Connection closed by authenticating user root 64.227.72.94 port 33306 [preauth]
Feb 16 13:28:32 compute-0 nova_compute[185723]: 2026-02-16 13:28:32.893 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:33 compute-0 nova_compute[185723]: 2026-02-16 13:28:33.166 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:34 compute-0 podman[208974]: 2026-02-16 13:28:34.028819998 +0000 UTC m=+0.066737137 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible)
Feb 16 13:28:34 compute-0 podman[208975]: 2026-02-16 13:28:34.073651237 +0000 UTC m=+0.100947881 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:28:37 compute-0 nova_compute[185723]: 2026-02-16 13:28:37.904 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:38 compute-0 nova_compute[185723]: 2026-02-16 13:28:38.167 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:39 compute-0 podman[209014]: 2026-02-16 13:28:39.119865089 +0000 UTC m=+0.154961419 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:28:40 compute-0 ovn_controller[96072]: 2026-02-16T13:28:40Z|00078|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:28:41 compute-0 sshd-session[209040]: Invalid user hadoop from 188.166.42.159 port 52484
Feb 16 13:28:41 compute-0 sshd-session[209040]: Connection closed by invalid user hadoop 188.166.42.159 port 52484 [preauth]
Feb 16 13:28:42 compute-0 nova_compute[185723]: 2026-02-16 13:28:42.906 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:43 compute-0 nova_compute[185723]: 2026-02-16 13:28:43.169 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:47 compute-0 nova_compute[185723]: 2026-02-16 13:28:47.910 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:48 compute-0 podman[209042]: 2026-02-16 13:28:48.048224286 +0000 UTC m=+0.083556667 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:28:48 compute-0 nova_compute[185723]: 2026-02-16 13:28:48.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:52 compute-0 nova_compute[185723]: 2026-02-16 13:28:52.952 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:53 compute-0 nova_compute[185723]: 2026-02-16 13:28:53.172 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:57 compute-0 nova_compute[185723]: 2026-02-16 13:28:57.955 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:58 compute-0 nova_compute[185723]: 2026-02-16 13:28:58.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:59 compute-0 sshd-session[209067]: Invalid user postgres from 146.190.22.227 port 50840
Feb 16 13:28:59 compute-0 sshd-session[209067]: Connection closed by invalid user postgres 146.190.22.227 port 50840 [preauth]
Feb 16 13:28:59 compute-0 podman[195053]: time="2026-02-16T13:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:28:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:28:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 16 13:29:01 compute-0 openstack_network_exporter[197909]: ERROR   13:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:29:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:29:01 compute-0 openstack_network_exporter[197909]: ERROR   13:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:29:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:29:03 compute-0 nova_compute[185723]: 2026-02-16 13:29:03.010 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:03 compute-0 nova_compute[185723]: 2026-02-16 13:29:03.177 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:03.221 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:03.222 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:03.222 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:04 compute-0 nova_compute[185723]: 2026-02-16 13:29:04.447 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:05 compute-0 podman[209070]: 2026-02-16 13:29:05.013303093 +0000 UTC m=+0.052712737 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Feb 16 13:29:05 compute-0 podman[209071]: 2026-02-16 13:29:05.047149668 +0000 UTC m=+0.083251759 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 16 13:29:06 compute-0 nova_compute[185723]: 2026-02-16 13:29:06.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:07 compute-0 nova_compute[185723]: 2026-02-16 13:29:07.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:07 compute-0 nova_compute[185723]: 2026-02-16 13:29:07.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.011 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.178 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.462 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.462 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.492 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.492 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.493 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.493 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.606 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.607 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.22699737548828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.607 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.607 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.671 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.671 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.699 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.717 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.718 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:29:08 compute-0 nova_compute[185723]: 2026-02-16 13:29:08.719 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:09 compute-0 nova_compute[185723]: 2026-02-16 13:29:09.690 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:10 compute-0 podman[209110]: 2026-02-16 13:29:10.032674894 +0000 UTC m=+0.075459685 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:29:10 compute-0 nova_compute[185723]: 2026-02-16 13:29:10.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:11 compute-0 nova_compute[185723]: 2026-02-16 13:29:11.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:13 compute-0 nova_compute[185723]: 2026-02-16 13:29:13.027 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:13 compute-0 nova_compute[185723]: 2026-02-16 13:29:13.180 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:13 compute-0 nova_compute[185723]: 2026-02-16 13:29:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:13 compute-0 nova_compute[185723]: 2026-02-16 13:29:13.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:29:14 compute-0 nova_compute[185723]: 2026-02-16 13:29:14.635 185727 DEBUG nova.compute.manager [None req-30e24f6c-2b0a-4632-baa5-6fbfc99115e1 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:29:14 compute-0 nova_compute[185723]: 2026-02-16 13:29:14.697 185727 DEBUG nova.compute.provider_tree [None req-30e24f6c-2b0a-4632-baa5-6fbfc99115e1 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 13 to 16 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:29:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:17.206 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:29:17 compute-0 nova_compute[185723]: 2026-02-16 13:29:17.207 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:17.208 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:29:18 compute-0 nova_compute[185723]: 2026-02-16 13:29:18.030 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:18 compute-0 nova_compute[185723]: 2026-02-16 13:29:18.181 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:19 compute-0 podman[209137]: 2026-02-16 13:29:19.01825291 +0000 UTC m=+0.054838970 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:29:20 compute-0 nova_compute[185723]: 2026-02-16 13:29:20.196 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:20 compute-0 sshd-session[209161]: Connection closed by authenticating user root 64.227.72.94 port 46648 [preauth]
Feb 16 13:29:21 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:29:21.210 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:29:23 compute-0 nova_compute[185723]: 2026-02-16 13:29:23.033 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:23 compute-0 nova_compute[185723]: 2026-02-16 13:29:23.183 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:26 compute-0 sshd-session[209163]: Invalid user admin from 146.190.226.24 port 47448
Feb 16 13:29:27 compute-0 sshd-session[209163]: Connection closed by invalid user admin 146.190.226.24 port 47448 [preauth]
Feb 16 13:29:28 compute-0 nova_compute[185723]: 2026-02-16 13:29:28.070 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:28 compute-0 nova_compute[185723]: 2026-02-16 13:29:28.186 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:29 compute-0 podman[195053]: time="2026-02-16T13:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:29:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:29:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:29:31 compute-0 openstack_network_exporter[197909]: ERROR   13:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:29:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:29:31 compute-0 openstack_network_exporter[197909]: ERROR   13:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:29:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:29:33 compute-0 nova_compute[185723]: 2026-02-16 13:29:33.072 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:33 compute-0 nova_compute[185723]: 2026-02-16 13:29:33.189 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:35 compute-0 sshd-session[209165]: Invalid user git from 188.166.42.159 port 48666
Feb 16 13:29:35 compute-0 podman[209168]: 2026-02-16 13:29:35.345824641 +0000 UTC m=+0.051407904 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:29:35 compute-0 podman[209167]: 2026-02-16 13:29:35.35256517 +0000 UTC m=+0.060962663 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:29:35 compute-0 sshd-session[209165]: Connection closed by invalid user git 188.166.42.159 port 48666 [preauth]
Feb 16 13:29:38 compute-0 nova_compute[185723]: 2026-02-16 13:29:38.075 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:38 compute-0 nova_compute[185723]: 2026-02-16 13:29:38.192 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:41 compute-0 podman[209205]: 2026-02-16 13:29:41.055099364 +0000 UTC m=+0.086992242 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:29:43 compute-0 nova_compute[185723]: 2026-02-16 13:29:43.104 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:43 compute-0 nova_compute[185723]: 2026-02-16 13:29:43.195 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:48 compute-0 nova_compute[185723]: 2026-02-16 13:29:48.106 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:48 compute-0 nova_compute[185723]: 2026-02-16 13:29:48.196 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:50 compute-0 podman[209233]: 2026-02-16 13:29:50.046733943 +0000 UTC m=+0.082636017 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:29:53 compute-0 nova_compute[185723]: 2026-02-16 13:29:53.109 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:53 compute-0 nova_compute[185723]: 2026-02-16 13:29:53.197 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:55 compute-0 ovn_controller[96072]: 2026-02-16T13:29:55Z|00079|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:29:58 compute-0 nova_compute[185723]: 2026-02-16 13:29:58.153 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:58 compute-0 nova_compute[185723]: 2026-02-16 13:29:58.198 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:59 compute-0 podman[195053]: time="2026-02-16T13:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:29:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:29:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:30:01 compute-0 openstack_network_exporter[197909]: ERROR   13:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:30:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:30:01 compute-0 openstack_network_exporter[197909]: ERROR   13:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:30:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:30:03 compute-0 nova_compute[185723]: 2026-02-16 13:30:03.156 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:03 compute-0 nova_compute[185723]: 2026-02-16 13:30:03.201 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:03.222 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:03.223 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:03.224 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:03 compute-0 sshd-session[209260]: Connection closed by 45.91.64.7 port 39398
Feb 16 13:30:04 compute-0 sshd-session[209261]: Unable to negotiate with 45.91.64.7 port 39408: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Feb 16 13:30:05 compute-0 nova_compute[185723]: 2026-02-16 13:30:05.435 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:06 compute-0 podman[209264]: 2026-02-16 13:30:06.024061176 +0000 UTC m=+0.056835323 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:30:06 compute-0 podman[209263]: 2026-02-16 13:30:06.028054025 +0000 UTC m=+0.067666723 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:30:07 compute-0 sshd-session[209300]: Connection closed by authenticating user root 64.227.72.94 port 43698 [preauth]
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.159 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.203 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.465 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.628 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.629 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.22699356079102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.630 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.630 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.698 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.699 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.720 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.735 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.736 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:30:08 compute-0 nova_compute[185723]: 2026-02-16 13:30:08.736 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:09 compute-0 nova_compute[185723]: 2026-02-16 13:30:09.736 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:09 compute-0 nova_compute[185723]: 2026-02-16 13:30:09.736 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:09 compute-0 nova_compute[185723]: 2026-02-16 13:30:09.737 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:10 compute-0 nova_compute[185723]: 2026-02-16 13:30:10.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:10 compute-0 nova_compute[185723]: 2026-02-16 13:30:10.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:30:10 compute-0 nova_compute[185723]: 2026-02-16 13:30:10.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:30:10 compute-0 nova_compute[185723]: 2026-02-16 13:30:10.450 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:30:12 compute-0 podman[209302]: 2026-02-16 13:30:12.082235375 +0000 UTC m=+0.117184591 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:30:12 compute-0 nova_compute[185723]: 2026-02-16 13:30:12.444 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:13 compute-0 nova_compute[185723]: 2026-02-16 13:30:13.160 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:13 compute-0 nova_compute[185723]: 2026-02-16 13:30:13.204 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:13 compute-0 nova_compute[185723]: 2026-02-16 13:30:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:13 compute-0 nova_compute[185723]: 2026-02-16 13:30:13.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:13 compute-0 nova_compute[185723]: 2026-02-16 13:30:13.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:30:17 compute-0 nova_compute[185723]: 2026-02-16 13:30:17.913 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:17.914 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:30:17 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:17.915 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:30:18 compute-0 nova_compute[185723]: 2026-02-16 13:30:18.162 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:18 compute-0 nova_compute[185723]: 2026-02-16 13:30:18.206 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:21 compute-0 podman[209330]: 2026-02-16 13:30:21.005179326 +0000 UTC m=+0.045980251 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:30:22 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:30:22.917 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:30:23 compute-0 nova_compute[185723]: 2026-02-16 13:30:23.165 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:23 compute-0 nova_compute[185723]: 2026-02-16 13:30:23.209 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:28 compute-0 nova_compute[185723]: 2026-02-16 13:30:28.168 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:28 compute-0 nova_compute[185723]: 2026-02-16 13:30:28.211 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:29 compute-0 sshd-session[209354]: Invalid user admin from 146.190.226.24 port 52962
Feb 16 13:30:29 compute-0 sshd-session[209354]: Connection closed by invalid user admin 146.190.226.24 port 52962 [preauth]
Feb 16 13:30:29 compute-0 podman[195053]: time="2026-02-16T13:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:30:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:30:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 16 13:30:31 compute-0 openstack_network_exporter[197909]: ERROR   13:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:30:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:30:31 compute-0 openstack_network_exporter[197909]: ERROR   13:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:30:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:30:33 compute-0 sshd-session[209356]: Invalid user deploy from 188.166.42.159 port 47812
Feb 16 13:30:33 compute-0 nova_compute[185723]: 2026-02-16 13:30:33.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:33 compute-0 nova_compute[185723]: 2026-02-16 13:30:33.214 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:33 compute-0 sshd-session[209356]: Connection closed by invalid user deploy 188.166.42.159 port 47812 [preauth]
Feb 16 13:30:37 compute-0 podman[209358]: 2026-02-16 13:30:37.052927892 +0000 UTC m=+0.084853873 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:30:37 compute-0 podman[209359]: 2026-02-16 13:30:37.052695136 +0000 UTC m=+0.080756480 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 16 13:30:38 compute-0 nova_compute[185723]: 2026-02-16 13:30:38.172 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:38 compute-0 nova_compute[185723]: 2026-02-16 13:30:38.216 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:42 compute-0 sshd-session[209397]: Invalid user oracle from 146.190.22.227 port 53328
Feb 16 13:30:42 compute-0 podman[209399]: 2026-02-16 13:30:42.791826777 +0000 UTC m=+0.076607307 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:30:43 compute-0 sshd-session[209397]: Connection closed by invalid user oracle 146.190.22.227 port 53328 [preauth]
Feb 16 13:30:43 compute-0 nova_compute[185723]: 2026-02-16 13:30:43.173 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:43 compute-0 nova_compute[185723]: 2026-02-16 13:30:43.217 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:48 compute-0 nova_compute[185723]: 2026-02-16 13:30:48.176 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:48 compute-0 nova_compute[185723]: 2026-02-16 13:30:48.219 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:52 compute-0 podman[209425]: 2026-02-16 13:30:52.009605852 +0000 UTC m=+0.048018112 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:30:53 compute-0 nova_compute[185723]: 2026-02-16 13:30:53.178 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:53 compute-0 nova_compute[185723]: 2026-02-16 13:30:53.221 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:56 compute-0 sshd-session[209449]: Connection closed by authenticating user root 64.227.72.94 port 52960 [preauth]
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.222 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.224 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.225 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.225 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.229 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:58 compute-0 nova_compute[185723]: 2026-02-16 13:30:58.229 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:30:59 compute-0 podman[195053]: time="2026-02-16T13:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:30:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:30:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:31:01 compute-0 openstack_network_exporter[197909]: ERROR   13:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:31:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:31:01 compute-0 openstack_network_exporter[197909]: ERROR   13:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:31:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:31:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:03.223 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:03.224 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:03.224 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.231 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.233 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:31:03 compute-0 nova_compute[185723]: 2026-02-16 13:31:03.234 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:07 compute-0 nova_compute[185723]: 2026-02-16 13:31:07.435 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:08 compute-0 podman[209452]: 2026-02-16 13:31:08.029263503 +0000 UTC m=+0.059066608 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 16 13:31:08 compute-0 podman[209451]: 2026-02-16 13:31:08.045881439 +0000 UTC m=+0.079123150 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 16 13:31:08 compute-0 nova_compute[185723]: 2026-02-16 13:31:08.234 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-0 nova_compute[185723]: 2026-02-16 13:31:09.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:09 compute-0 nova_compute[185723]: 2026-02-16 13:31:09.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:09 compute-0 nova_compute[185723]: 2026-02-16 13:31:09.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:09 compute-0 nova_compute[185723]: 2026-02-16 13:31:09.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.460 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.460 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.518 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.518 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.519 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.519 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.691 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.692 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5859MB free_disk=73.22721862792969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.692 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.693 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.783 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.783 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.807 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.837 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.838 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.863 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.893 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.918 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.937 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.939 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:31:10 compute-0 nova_compute[185723]: 2026-02-16 13:31:10.939 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:12 compute-0 nova_compute[185723]: 2026-02-16 13:31:12.935 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:13 compute-0 podman[209493]: 2026-02-16 13:31:13.101900688 +0000 UTC m=+0.137944121 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Feb 16 13:31:13 compute-0 nova_compute[185723]: 2026-02-16 13:31:13.235 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:13 compute-0 nova_compute[185723]: 2026-02-16 13:31:13.236 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:13 compute-0 nova_compute[185723]: 2026-02-16 13:31:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:13 compute-0 nova_compute[185723]: 2026-02-16 13:31:13.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:31:14 compute-0 nova_compute[185723]: 2026-02-16 13:31:14.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:18 compute-0 nova_compute[185723]: 2026-02-16 13:31:18.237 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.662 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.663 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.678 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.769 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.770 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.794 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.794 185727 INFO nova.compute.claims [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.921 185727 DEBUG nova.compute.provider_tree [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.941 185727 DEBUG nova.scheduler.client.report [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.966 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:19 compute-0 nova_compute[185723]: 2026-02-16 13:31:19.966 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.015 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.015 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.035 185727 INFO nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.051 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.182 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.185 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.185 185727 INFO nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Creating image(s)
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.186 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.187 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.188 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.213 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.293 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.298 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.299 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.310 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.373 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.375 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.412 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.413 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.414 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.462 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.463 185727 DEBUG nova.virt.disk.api [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Checking if we can resize image /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.464 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.509 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.510 185727 DEBUG nova.virt.disk.api [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Cannot resize image /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.511 185727 DEBUG nova.objects.instance [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'migration_context' on Instance uuid 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.530 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.530 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Ensure instance console log exists: /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.531 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.531 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.532 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:20 compute-0 nova_compute[185723]: 2026-02-16 13:31:20.865 185727 DEBUG nova.policy [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8712c0037def471dabf14879c0a418ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:31:21 compute-0 nova_compute[185723]: 2026-02-16 13:31:21.840 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:21 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:21.840 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:31:21 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:21.841 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:31:22 compute-0 nova_compute[185723]: 2026-02-16 13:31:22.112 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Successfully created port: c0af6030-8607-421e-b581-c7d30d70b02d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:31:23 compute-0 podman[209532]: 2026-02-16 13:31:23.04424291 +0000 UTC m=+0.086438363 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.126 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Successfully updated port: c0af6030-8607-421e-b581-c7d30d70b02d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.153 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.153 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquired lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.153 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.239 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.292 185727 DEBUG nova.compute.manager [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-changed-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.293 185727 DEBUG nova.compute.manager [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Refreshing instance network info cache due to event network-changed-c0af6030-8607-421e-b581-c7d30d70b02d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.293 185727 DEBUG oslo_concurrency.lockutils [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:31:23 compute-0 nova_compute[185723]: 2026-02-16 13:31:23.785 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.859 185727 DEBUG nova.network.neutron [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updating instance_info_cache with network_info: [{"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.908 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Releasing lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.908 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Instance network_info: |[{"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.909 185727 DEBUG oslo_concurrency.lockutils [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.910 185727 DEBUG nova.network.neutron [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Refreshing network info cache for port c0af6030-8607-421e-b581-c7d30d70b02d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.913 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Start _get_guest_xml network_info=[{"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.916 185727 WARNING nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.920 185727 DEBUG nova.virt.libvirt.host [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.921 185727 DEBUG nova.virt.libvirt.host [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.924 185727 DEBUG nova.virt.libvirt.host [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.925 185727 DEBUG nova.virt.libvirt.host [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.926 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.927 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.927 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.928 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.928 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.928 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.929 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.929 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.929 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.930 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.930 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.930 185727 DEBUG nova.virt.hardware [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.935 185727 DEBUG nova.virt.libvirt.vif [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-407301435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-407301435',id=10,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-w7yvpawb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:31:20Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.935 185727 DEBUG nova.network.os_vif_util [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.936 185727 DEBUG nova.network.os_vif_util [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.938 185727 DEBUG nova.objects.instance [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.960 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <uuid>1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa</uuid>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <name>instance-0000000a</name>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-407301435</nova:name>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:31:25</nova:creationTime>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:user uuid="8712c0037def471dabf14879c0a418ec">tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member</nova:user>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:project uuid="9d212b8e966a499a9aad9b972bb7e76d">tempest-TestExecuteHostMaintenanceStrategy-464275700</nova:project>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         <nova:port uuid="c0af6030-8607-421e-b581-c7d30d70b02d">
Feb 16 13:31:25 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <system>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="serial">1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="uuid">1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </system>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <os>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </os>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <features>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </features>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.config"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:eb:fe:c4"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <target dev="tapc0af6030-86"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/console.log" append="off"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <video>
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </video>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:31:25 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:31:25 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:31:25 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:31:25 compute-0 nova_compute[185723]: </domain>
Feb 16 13:31:25 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.961 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Preparing to wait for external event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.961 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.961 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.962 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.962 185727 DEBUG nova.virt.libvirt.vif [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-407301435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-407301435',id=10,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-w7yvpawb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:31:20Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.962 185727 DEBUG nova.network.os_vif_util [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.963 185727 DEBUG nova.network.os_vif_util [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.963 185727 DEBUG os_vif [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.964 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.964 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.965 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.969 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.969 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0af6030-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.970 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0af6030-86, col_values=(('external_ids', {'iface-id': 'c0af6030-8607-421e-b581-c7d30d70b02d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:fe:c4', 'vm-uuid': '1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.971 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:25 compute-0 NetworkManager[56177]: <info>  [1771248685.9725] manager: (tapc0af6030-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.974 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.977 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:25 compute-0 nova_compute[185723]: 2026-02-16 13:31:25.978 185727 INFO os_vif [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86')
Feb 16 13:31:26 compute-0 nova_compute[185723]: 2026-02-16 13:31:26.023 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:31:26 compute-0 nova_compute[185723]: 2026-02-16 13:31:26.024 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:31:26 compute-0 nova_compute[185723]: 2026-02-16 13:31:26.024 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No VIF found with MAC fa:16:3e:eb:fe:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:31:26 compute-0 nova_compute[185723]: 2026-02-16 13:31:26.025 185727 INFO nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Using config drive
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.262 185727 INFO nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Creating config drive at /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.config
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.270 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp80m9_1u2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.400 185727 DEBUG oslo_concurrency.processutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp80m9_1u2" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:27 compute-0 kernel: tapc0af6030-86: entered promiscuous mode
Feb 16 13:31:27 compute-0 ovn_controller[96072]: 2026-02-16T13:31:27Z|00080|binding|INFO|Claiming lport c0af6030-8607-421e-b581-c7d30d70b02d for this chassis.
Feb 16 13:31:27 compute-0 ovn_controller[96072]: 2026-02-16T13:31:27Z|00081|binding|INFO|c0af6030-8607-421e-b581-c7d30d70b02d: Claiming fa:16:3e:eb:fe:c4 10.100.0.12
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.4617] manager: (tapc0af6030-86): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.460 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.464 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.468 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.475 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:fe:c4 10.100.0.12'], port_security=['fa:16:3e:eb:fe:c4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=c0af6030-8607-421e-b581-c7d30d70b02d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.476 105360 INFO neutron.agent.ovn.metadata.agent [-] Port c0af6030-8607-421e-b581-c7d30d70b02d in datapath 34e10b77-8ec0-4af1-a031-d83792585eee bound to our chassis
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.477 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.486 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.489 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0d3cc3-6141-41a9-add7-ff3ef41f5084]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.489 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap34e10b77-81 in ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:31:27 compute-0 ovn_controller[96072]: 2026-02-16T13:31:27Z|00082|binding|INFO|Setting lport c0af6030-8607-421e-b581-c7d30d70b02d ovn-installed in OVS
Feb 16 13:31:27 compute-0 ovn_controller[96072]: 2026-02-16T13:31:27Z|00083|binding|INFO|Setting lport c0af6030-8607-421e-b581-c7d30d70b02d up in Southbound
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.491 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.491 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap34e10b77-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.492 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[be0d28a2-bfb8-4c09-a878-92fc28ab2413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 systemd-udevd[209578]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:31:27 compute-0 systemd-machined[155229]: New machine qemu-6-instance-0000000a.
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.492 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d8329a98-44db-4258-be6e-414f8390aa9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.500 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[a09c8ca8-149a-4d77-81fe-3e9d04d2473e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.5038] device (tapc0af6030-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.5045] device (tapc0af6030-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:31:27 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.512 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ea63dc44-9483-4809-a8ed-308c4c835999]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.535 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a4e6a62b-2ba1-437e-9fc5-c19a127ceec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.5415] manager: (tap34e10b77-80): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.540 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ce3aec-a2d2-4e05-a802-513c115da0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.568 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6d32c0-b980-4752-ba17-500342df8637]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.571 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a7dcf344-5f21-42e8-b42a-31d5911bf1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.5846] device (tap34e10b77-80): carrier: link connected
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.586 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5515e8-71bc-4b94-9f73-2f5039e500db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.600 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[28dcf747-8062-4590-a297-9dc7e5be63eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477097, 'reachable_time': 25835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209610, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.608 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4dda954a-d0b4-409e-ac49-c31a62815340]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:3113'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477097, 'tstamp': 477097}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209611, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.623 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ced1ff66-b76f-466f-aebc-611b0e1a1dcd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477097, 'reachable_time': 25835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209613, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.638 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cd1628d9-1791-4b93-b906-e1f060401d92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.687 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[540ce10d-ceaf-4eb5-8e7a-caef6792f4bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.689 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.690 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.691 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34e10b77-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.694 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 kernel: tap34e10b77-80: entered promiscuous mode
Feb 16 13:31:27 compute-0 NetworkManager[56177]: <info>  [1771248687.6977] manager: (tap34e10b77-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.701 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34e10b77-80, col_values=(('external_ids', {'iface-id': '37eb0121-3449-47dc-8fd8-69d7f9268b6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.703 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 ovn_controller[96072]: 2026-02-16T13:31:27Z|00084|binding|INFO|Releasing lport 37eb0121-3449-47dc-8fd8-69d7f9268b6f from this chassis (sb_readonly=0)
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.705 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.709 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[af727065-7770-4a12-8336-67b86b58fe9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.710 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:31:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:27.711 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'env', 'PROCESS_TAG=haproxy-34e10b77-8ec0-4af1-a031-d83792585eee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/34e10b77-8ec0-4af1-a031-d83792585eee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.714 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.779 185727 DEBUG nova.compute.manager [req-0488f594-efad-494d-aed0-1d582c3dec1b req-3b62e612-98c9-4f8b-9520-d2c12c7d65fa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.780 185727 DEBUG oslo_concurrency.lockutils [req-0488f594-efad-494d-aed0-1d582c3dec1b req-3b62e612-98c9-4f8b-9520-d2c12c7d65fa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.780 185727 DEBUG oslo_concurrency.lockutils [req-0488f594-efad-494d-aed0-1d582c3dec1b req-3b62e612-98c9-4f8b-9520-d2c12c7d65fa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.781 185727 DEBUG oslo_concurrency.lockutils [req-0488f594-efad-494d-aed0-1d582c3dec1b req-3b62e612-98c9-4f8b-9520-d2c12c7d65fa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.781 185727 DEBUG nova.compute.manager [req-0488f594-efad-494d-aed0-1d582c3dec1b req-3b62e612-98c9-4f8b-9520-d2c12c7d65fa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Processing event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.961 185727 DEBUG nova.network.neutron [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updated VIF entry in instance network info cache for port c0af6030-8607-421e-b581-c7d30d70b02d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.961 185727 DEBUG nova.network.neutron [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updating instance_info_cache with network_info: [{"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:31:27 compute-0 nova_compute[185723]: 2026-02-16 13:31:27.978 185727 DEBUG oslo_concurrency.lockutils [req-7328c687-f691-48d8-ab6d-a652c2451008 req-a7c1f2e0-b9b6-49da-b27c-0c77ba6b5552 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.058 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.059 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248688.0578969, 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.060 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] VM Started (Lifecycle Event)
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.063 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.067 185727 INFO nova.virt.libvirt.driver [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Instance spawned successfully.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.068 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.083 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:28 compute-0 podman[209652]: 2026-02-16 13:31:28.087033398 +0000 UTC m=+0.057364186 container create cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.087 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.114 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.116 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248688.058121, 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.117 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] VM Paused (Lifecycle Event)
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.120 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.121 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.121 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.122 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.123 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.123 185727 DEBUG nova.virt.libvirt.driver [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:28 compute-0 systemd[1]: Started libpod-conmon-cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1.scope.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.152 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:28 compute-0 podman[209652]: 2026-02-16 13:31:28.061803987 +0000 UTC m=+0.032134795 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.155 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248688.0631695, 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.155 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] VM Resumed (Lifecycle Event)
Feb 16 13:31:28 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:31:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc5a63ec9f3bdef818d1cc65d7d69c8c0ef7ad3f99c8e3572288f2d05872350c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:31:28 compute-0 podman[209652]: 2026-02-16 13:31:28.174364082 +0000 UTC m=+0.144694900 container init cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 16 13:31:28 compute-0 podman[209652]: 2026-02-16 13:31:28.179057949 +0000 UTC m=+0.149388737 container start cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.190 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.193 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:31:28 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [NOTICE]   (209673) : New worker (209675) forked
Feb 16 13:31:28 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [NOTICE]   (209673) : Loading success.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.203 185727 INFO nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Took 8.02 seconds to spawn the instance on the hypervisor.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.204 185727 DEBUG nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.214 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.241 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:28 compute-0 sshd-session[209612]: Invalid user test from 188.166.42.159 port 33160
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.282 185727 INFO nova.compute.manager [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Took 8.55 seconds to build instance.
Feb 16 13:31:28 compute-0 nova_compute[185723]: 2026-02-16 13:31:28.303 185727 DEBUG oslo_concurrency.lockutils [None req-da498c4e-2782-4dd4-ab45-4e1b065587fa 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:28 compute-0 sshd-session[209612]: Connection closed by invalid user test 188.166.42.159 port 33160 [preauth]
Feb 16 13:31:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:31:28.843 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:29 compute-0 podman[195053]: time="2026-02-16T13:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:31:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:31:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.885 185727 DEBUG nova.compute.manager [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.887 185727 DEBUG oslo_concurrency.lockutils [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.887 185727 DEBUG oslo_concurrency.lockutils [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.888 185727 DEBUG oslo_concurrency.lockutils [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.888 185727 DEBUG nova.compute.manager [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] No waiting events found dispatching network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:31:29 compute-0 nova_compute[185723]: 2026-02-16 13:31:29.889 185727 WARNING nova.compute.manager [req-584f4f49-d5e5-4c1f-8a4e-6ac1b1016da1 req-cb13300d-f80e-427d-a5e1-d89fc92582b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received unexpected event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d for instance with vm_state active and task_state None.
Feb 16 13:31:30 compute-0 nova_compute[185723]: 2026-02-16 13:31:30.972 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:31 compute-0 openstack_network_exporter[197909]: ERROR   13:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:31:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:31:31 compute-0 openstack_network_exporter[197909]: ERROR   13:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:31:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:31:33 compute-0 nova_compute[185723]: 2026-02-16 13:31:33.243 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:35 compute-0 sshd-session[209685]: Invalid user admin from 146.190.226.24 port 45640
Feb 16 13:31:35 compute-0 sshd-session[209685]: Connection closed by invalid user admin 146.190.226.24 port 45640 [preauth]
Feb 16 13:31:35 compute-0 nova_compute[185723]: 2026-02-16 13:31:35.975 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:38 compute-0 nova_compute[185723]: 2026-02-16 13:31:38.245 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:39 compute-0 podman[209710]: 2026-02-16 13:31:39.046112227 +0000 UTC m=+0.074479554 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:31:39 compute-0 podman[209709]: 2026-02-16 13:31:39.052088617 +0000 UTC m=+0.085290954 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:31:39 compute-0 ovn_controller[96072]: 2026-02-16T13:31:39Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:fe:c4 10.100.0.12
Feb 16 13:31:39 compute-0 ovn_controller[96072]: 2026-02-16T13:31:39Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:fe:c4 10.100.0.12
Feb 16 13:31:40 compute-0 nova_compute[185723]: 2026-02-16 13:31:40.978 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:43 compute-0 nova_compute[185723]: 2026-02-16 13:31:43.247 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:44 compute-0 podman[209751]: 2026-02-16 13:31:44.08717986 +0000 UTC m=+0.116759611 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=ovn_controller)
Feb 16 13:31:45 compute-0 nova_compute[185723]: 2026-02-16 13:31:45.980 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:48 compute-0 nova_compute[185723]: 2026-02-16 13:31:48.249 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:49 compute-0 sshd-session[209777]: Invalid user admin from 64.227.72.94 port 47964
Feb 16 13:31:49 compute-0 sshd-session[209777]: Connection closed by invalid user admin 64.227.72.94 port 47964 [preauth]
Feb 16 13:31:50 compute-0 nova_compute[185723]: 2026-02-16 13:31:50.984 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:53 compute-0 nova_compute[185723]: 2026-02-16 13:31:53.252 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:54 compute-0 podman[209779]: 2026-02-16 13:31:54.037242065 +0000 UTC m=+0.077741815 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:31:56 compute-0 nova_compute[185723]: 2026-02-16 13:31:56.035 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:57 compute-0 ovn_controller[96072]: 2026-02-16T13:31:57Z|00085|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Feb 16 13:31:58 compute-0 nova_compute[185723]: 2026-02-16 13:31:58.293 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:59 compute-0 podman[195053]: time="2026-02-16T13:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:31:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:31:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:32:01 compute-0 nova_compute[185723]: 2026-02-16 13:32:01.037 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:01 compute-0 anacron[60464]: Job `cron.weekly' started
Feb 16 13:32:01 compute-0 anacron[60464]: Job `cron.weekly' terminated
Feb 16 13:32:01 compute-0 openstack_network_exporter[197909]: ERROR   13:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:32:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:32:01 compute-0 openstack_network_exporter[197909]: ERROR   13:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:32:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:32:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:03.226 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:03.231 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:03.233 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:03 compute-0 nova_compute[185723]: 2026-02-16 13:32:03.298 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:06 compute-0 nova_compute[185723]: 2026-02-16 13:32:06.040 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:08 compute-0 nova_compute[185723]: 2026-02-16 13:32:08.299 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:09 compute-0 nova_compute[185723]: 2026-02-16 13:32:09.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:10 compute-0 podman[209806]: 2026-02-16 13:32:10.022454947 +0000 UTC m=+0.055160676 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:32:10 compute-0 podman[209805]: 2026-02-16 13:32:10.028217821 +0000 UTC m=+0.064834557 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, 
name=ubi9/ubi-minimal, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:32:10 compute-0 nova_compute[185723]: 2026-02-16 13:32:10.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:10 compute-0 nova_compute[185723]: 2026-02-16 13:32:10.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:32:10 compute-0 nova_compute[185723]: 2026-02-16 13:32:10.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:32:11 compute-0 nova_compute[185723]: 2026-02-16 13:32:11.085 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:11 compute-0 nova_compute[185723]: 2026-02-16 13:32:11.859 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:32:11 compute-0 nova_compute[185723]: 2026-02-16 13:32:11.860 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:32:11 compute-0 nova_compute[185723]: 2026-02-16 13:32:11.860 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:32:11 compute-0 nova_compute[185723]: 2026-02-16 13:32:11.860 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:32:13 compute-0 nova_compute[185723]: 2026-02-16 13:32:13.301 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:15 compute-0 podman[209842]: 2026-02-16 13:32:15.08669798 +0000 UTC m=+0.126646617 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2)
Feb 16 13:32:16 compute-0 nova_compute[185723]: 2026-02-16 13:32:16.087 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:17 compute-0 nova_compute[185723]: 2026-02-16 13:32:17.941 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updating instance_info_cache with network_info: [{"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:32:18 compute-0 nova_compute[185723]: 2026-02-16 13:32:18.344 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.867 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.867 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.868 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.869 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.869 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.869 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.870 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.870 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.871 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.899 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.899 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.900 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.900 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:32:19 compute-0 nova_compute[185723]: 2026-02-16 13:32:19.982 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.040 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.041 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.105 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.249 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.251 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5628MB free_disk=73.19847106933594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.251 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.251 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.335 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.336 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.336 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.383 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.405 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.427 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:32:20 compute-0 nova_compute[185723]: 2026-02-16 13:32:20.428 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:21 compute-0 nova_compute[185723]: 2026-02-16 13:32:21.090 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:21 compute-0 sshd-session[209876]: Invalid user nagios from 188.166.42.159 port 39286
Feb 16 13:32:21 compute-0 sshd-session[209876]: Connection closed by invalid user nagios 188.166.42.159 port 39286 [preauth]
Feb 16 13:32:23 compute-0 nova_compute[185723]: 2026-02-16 13:32:23.347 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:23 compute-0 nova_compute[185723]: 2026-02-16 13:32:23.422 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:25 compute-0 podman[209878]: 2026-02-16 13:32:25.012195776 +0000 UTC m=+0.050999282 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:32:26 compute-0 nova_compute[185723]: 2026-02-16 13:32:26.093 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:28 compute-0 nova_compute[185723]: 2026-02-16 13:32:28.392 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:29 compute-0 podman[195053]: time="2026-02-16T13:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:32:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:32:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Feb 16 13:32:31 compute-0 nova_compute[185723]: 2026-02-16 13:32:31.094 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:31 compute-0 openstack_network_exporter[197909]: ERROR   13:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:32:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:32:31 compute-0 openstack_network_exporter[197909]: ERROR   13:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:32:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:32:31 compute-0 nova_compute[185723]: 2026-02-16 13:32:31.659 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating tmpfile /var/lib/nova/instances/tmp4z78b9dd to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:32:31 compute-0 sshd-session[209903]: Invalid user user from 146.190.22.227 port 35690
Feb 16 13:32:31 compute-0 nova_compute[185723]: 2026-02-16 13:32:31.807 185727 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:32:32 compute-0 sshd-session[209903]: Connection closed by invalid user user 146.190.22.227 port 35690 [preauth]
Feb 16 13:32:33 compute-0 nova_compute[185723]: 2026-02-16 13:32:33.395 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:34 compute-0 nova_compute[185723]: 2026-02-16 13:32:34.294 185727 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:32:34 compute-0 nova_compute[185723]: 2026-02-16 13:32:34.356 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:32:34 compute-0 nova_compute[185723]: 2026-02-16 13:32:34.357 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:32:34 compute-0 nova_compute[185723]: 2026-02-16 13:32:34.358 185727 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.097 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.221 185727 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.244 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.246 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.247 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating instance directory: /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.247 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating disk.info with the contents: {'/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk': 'qcow2', '/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.248 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.248 185727 DEBUG nova.objects.instance [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.294 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.371 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.373 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.373 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.384 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.447 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.448 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.479 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.481 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.482 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.532 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.533 185727 DEBUG nova.virt.disk.api [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.534 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.582 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.584 185727 DEBUG nova.virt.disk.api [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.584 185727 DEBUG nova.objects.instance [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.615 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.635 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.637 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config to /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:32:36 compute-0 nova_compute[185723]: 2026-02-16 13:32:36.637 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.062 185727 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.063 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.064 185727 DEBUG nova.virt.libvirt.vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:31:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:31:11Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.065 185727 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.065 185727 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.066 185727 DEBUG os_vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.066 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.067 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.067 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.071 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.072 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ac0912f-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.072 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ac0912f-d5, col_values=(('external_ids', {'iface-id': '9ac0912f-d593-4dad-bf05-01d7dd0b6677', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:1f:94', 'vm-uuid': '07689e3f-f214-4f57-a662-bc531b614c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.075 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:37 compute-0 NetworkManager[56177]: <info>  [1771248757.0761] manager: (tap9ac0912f-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.077 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.084 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.086 185727 INFO os_vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5')
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.087 185727 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.087 185727 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:32:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:37.631 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:32:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:37.632 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:32:37 compute-0 nova_compute[185723]: 2026-02-16 13:32:37.682 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:38 compute-0 nova_compute[185723]: 2026-02-16 13:32:38.397 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:38 compute-0 nova_compute[185723]: 2026-02-16 13:32:38.435 185727 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:32:38 compute-0 nova_compute[185723]: 2026-02-16 13:32:38.437 185727 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:32:38 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:32:38 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:32:38 compute-0 kernel: tap9ac0912f-d5: entered promiscuous mode
Feb 16 13:32:38 compute-0 NetworkManager[56177]: <info>  [1771248758.7651] manager: (tap9ac0912f-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 16 13:32:38 compute-0 systemd-udevd[209959]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:32:38 compute-0 ovn_controller[96072]: 2026-02-16T13:32:38Z|00086|binding|INFO|Claiming lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 for this additional chassis.
Feb 16 13:32:38 compute-0 ovn_controller[96072]: 2026-02-16T13:32:38Z|00087|binding|INFO|9ac0912f-d593-4dad-bf05-01d7dd0b6677: Claiming fa:16:3e:ba:1f:94 10.100.0.14
Feb 16 13:32:38 compute-0 nova_compute[185723]: 2026-02-16 13:32:38.802 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:38 compute-0 ovn_controller[96072]: 2026-02-16T13:32:38Z|00088|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 ovn-installed in OVS
Feb 16 13:32:38 compute-0 nova_compute[185723]: 2026-02-16 13:32:38.809 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:38 compute-0 NetworkManager[56177]: <info>  [1771248758.8291] device (tap9ac0912f-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:32:38 compute-0 NetworkManager[56177]: <info>  [1771248758.8300] device (tap9ac0912f-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:32:38 compute-0 systemd-machined[155229]: New machine qemu-7-instance-00000009.
Feb 16 13:32:38 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000009.
Feb 16 13:32:39 compute-0 nova_compute[185723]: 2026-02-16 13:32:39.386 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248759.3863714, 07689e3f-f214-4f57-a662-bc531b614c3d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:32:39 compute-0 nova_compute[185723]: 2026-02-16 13:32:39.387 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Started (Lifecycle Event)
Feb 16 13:32:39 compute-0 nova_compute[185723]: 2026-02-16 13:32:39.413 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:32:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:39.635 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:41 compute-0 podman[209992]: 2026-02-16 13:32:41.203206962 +0000 UTC m=+0.055667568 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:32:41 compute-0 podman[209991]: 2026-02-16 13:32:41.234246696 +0000 UTC m=+0.086732573 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Feb 16 13:32:41 compute-0 sshd-session[210010]: Invalid user admin from 64.227.72.94 port 53726
Feb 16 13:32:41 compute-0 nova_compute[185723]: 2026-02-16 13:32:41.808 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248761.8076274, 07689e3f-f214-4f57-a662-bc531b614c3d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:32:41 compute-0 nova_compute[185723]: 2026-02-16 13:32:41.809 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Resumed (Lifecycle Event)
Feb 16 13:32:41 compute-0 nova_compute[185723]: 2026-02-16 13:32:41.860 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:32:41 compute-0 nova_compute[185723]: 2026-02-16 13:32:41.865 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:32:41 compute-0 sshd-session[210010]: Connection closed by invalid user admin 64.227.72.94 port 53726 [preauth]
Feb 16 13:32:41 compute-0 nova_compute[185723]: 2026-02-16 13:32:41.926 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:32:42 compute-0 nova_compute[185723]: 2026-02-16 13:32:42.076 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:42 compute-0 sshd-session[210049]: Invalid user admin from 146.190.226.24 port 42984
Feb 16 13:32:43 compute-0 sshd-session[210049]: Connection closed by invalid user admin 146.190.226.24 port 42984 [preauth]
Feb 16 13:32:43 compute-0 nova_compute[185723]: 2026-02-16 13:32:43.398 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-0 ovn_controller[96072]: 2026-02-16T13:32:43Z|00089|binding|INFO|Claiming lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 for this chassis.
Feb 16 13:32:43 compute-0 ovn_controller[96072]: 2026-02-16T13:32:43Z|00090|binding|INFO|9ac0912f-d593-4dad-bf05-01d7dd0b6677: Claiming fa:16:3e:ba:1f:94 10.100.0.14
Feb 16 13:32:43 compute-0 ovn_controller[96072]: 2026-02-16T13:32:43Z|00091|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 up in Southbound
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.677 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:1f:94 10.100.0.14'], port_security=['fa:16:3e:ba:1f:94 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '07689e3f-f214-4f57-a662-bc531b614c3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=9ac0912f-d593-4dad-bf05-01d7dd0b6677) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.679 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 in datapath 34e10b77-8ec0-4af1-a031-d83792585eee bound to our chassis
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.680 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.695 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[90490594-98f3-4a8f-aaa4-d94c2cdd69a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.718 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa6c60b-11f6-4e89-9e58-82d6a699964b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.722 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[8d49e5b2-e1cb-4ae5-a5f5-4f5e607d7b3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.749 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc21492-7dbe-4de8-af20-f8cceb8378d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.767 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e67f1a4b-a0f4-4701-8ff5-81c8c6049f81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477097, 'reachable_time': 25835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210056, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.783 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca591eb-d996-47e4-951c-82dc5e50efa4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap34e10b77-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477105, 'tstamp': 477105}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210057, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap34e10b77-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477107, 'tstamp': 477107}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210057, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.785 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:43 compute-0 nova_compute[185723]: 2026-02-16 13:32:43.788 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-0 nova_compute[185723]: 2026-02-16 13:32:43.789 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.789 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34e10b77-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.790 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.790 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34e10b77-80, col_values=(('external_ids', {'iface-id': '37eb0121-3449-47dc-8fd8-69d7f9268b6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:32:43.791 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:32:44 compute-0 nova_compute[185723]: 2026-02-16 13:32:44.051 185727 INFO nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Post operation of migration started
Feb 16 13:32:44 compute-0 nova_compute[185723]: 2026-02-16 13:32:44.912 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:32:44 compute-0 nova_compute[185723]: 2026-02-16 13:32:44.913 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:32:44 compute-0 nova_compute[185723]: 2026-02-16 13:32:44.913 185727 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:32:46 compute-0 podman[210058]: 2026-02-16 13:32:46.048296414 +0000 UTC m=+0.090619689 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:32:47 compute-0 nova_compute[185723]: 2026-02-16 13:32:47.078 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:48 compute-0 nova_compute[185723]: 2026-02-16 13:32:48.401 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.021 185727 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.089 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.136 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.137 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.137 185727 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:50 compute-0 nova_compute[185723]: 2026-02-16 13:32:50.142 185727 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:32:50 compute-0 virtqemud[184843]: Domain id=7 name='instance-00000009' uuid=07689e3f-f214-4f57-a662-bc531b614c3d is tainted: custom-monitor
Feb 16 13:32:51 compute-0 nova_compute[185723]: 2026-02-16 13:32:51.150 185727 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:32:52 compute-0 nova_compute[185723]: 2026-02-16 13:32:52.081 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:52 compute-0 nova_compute[185723]: 2026-02-16 13:32:52.157 185727 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:32:52 compute-0 nova_compute[185723]: 2026-02-16 13:32:52.164 185727 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:32:52 compute-0 nova_compute[185723]: 2026-02-16 13:32:52.798 185727 DEBUG nova.objects.instance [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:32:53 compute-0 nova_compute[185723]: 2026-02-16 13:32:53.404 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:56 compute-0 podman[210084]: 2026-02-16 13:32:56.035364206 +0000 UTC m=+0.067063052 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:32:57 compute-0 nova_compute[185723]: 2026-02-16 13:32:57.083 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:58 compute-0 nova_compute[185723]: 2026-02-16 13:32:58.448 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:59 compute-0 podman[195053]: time="2026-02-16T13:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:32:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:32:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:33:01 compute-0 openstack_network_exporter[197909]: ERROR   13:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:33:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:33:01 compute-0 openstack_network_exporter[197909]: ERROR   13:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:33:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:33:02 compute-0 nova_compute[185723]: 2026-02-16 13:33:02.085 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.226 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.411 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.412 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.412 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.412 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.412 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.413 185727 INFO nova.compute.manager [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Terminating instance
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.414 185727 DEBUG nova.compute.manager [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.432 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:33:03 compute-0 kernel: tapc0af6030-86 (unregistering): left promiscuous mode
Feb 16 13:33:03 compute-0 NetworkManager[56177]: <info>  [1771248783.4489] device (tapc0af6030-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:33:03 compute-0 ovn_controller[96072]: 2026-02-16T13:33:03Z|00092|binding|INFO|Releasing lport c0af6030-8607-421e-b581-c7d30d70b02d from this chassis (sb_readonly=0)
Feb 16 13:33:03 compute-0 ovn_controller[96072]: 2026-02-16T13:33:03Z|00093|binding|INFO|Setting lport c0af6030-8607-421e-b581-c7d30d70b02d down in Southbound
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.453 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 ovn_controller[96072]: 2026-02-16T13:33:03Z|00094|binding|INFO|Removing iface tapc0af6030-86 ovn-installed in OVS
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.456 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.458 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.461 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.464 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:fe:c4 10.100.0.12'], port_security=['fa:16:3e:eb:fe:c4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=c0af6030-8607-421e-b581-c7d30d70b02d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.465 105360 INFO neutron.agent.ovn.metadata.agent [-] Port c0af6030-8607-421e-b581-c7d30d70b02d in datapath 34e10b77-8ec0-4af1-a031-d83792585eee unbound from our chassis
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.466 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.481 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[18655a54-3195-48c3-b3f3-41a82f8906f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Feb 16 13:33:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 15.614s CPU time.
Feb 16 13:33:03 compute-0 systemd-machined[155229]: Machine qemu-6-instance-0000000a terminated.
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.510 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[7fed17b8-2df7-4c94-add1-1bb32eefd8ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.514 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8ee66b-92b4-463f-b42e-eb0899e850f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.540 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb0185d-bb01-4979-93e3-1762b602e7fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.554 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6c36f74c-0d67-4f03-9868-10d53e4687a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477097, 'reachable_time': 25835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210122, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.568 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7ebb63-828c-4eed-b320-9298d1297e66]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap34e10b77-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477105, 'tstamp': 477105}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210123, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap34e10b77-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477107, 'tstamp': 477107}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210123, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.569 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.571 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.575 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.576 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34e10b77-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.577 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.577 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34e10b77-80, col_values=(('external_ids', {'iface-id': '37eb0121-3449-47dc-8fd8-69d7f9268b6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:03.578 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.671 185727 INFO nova.virt.libvirt.driver [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Instance destroyed successfully.
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.671 185727 DEBUG nova.objects.instance [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'resources' on Instance uuid 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.694 185727 DEBUG nova.virt.libvirt.vif [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:31:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-407301435',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-407301435',id=10,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:31:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-w7yvpawb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:31:28Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.695 185727 DEBUG nova.network.os_vif_util [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "c0af6030-8607-421e-b581-c7d30d70b02d", "address": "fa:16:3e:eb:fe:c4", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0af6030-86", "ovs_interfaceid": "c0af6030-8607-421e-b581-c7d30d70b02d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.695 185727 DEBUG nova.network.os_vif_util [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.696 185727 DEBUG os_vif [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.697 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.697 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0af6030-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.699 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.701 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.704 185727 INFO os_vif [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:fe:c4,bridge_name='br-int',has_traffic_filtering=True,id=c0af6030-8607-421e-b581-c7d30d70b02d,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0af6030-86')
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.705 185727 INFO nova.virt.libvirt.driver [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Deleting instance files /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa_del
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.705 185727 INFO nova.virt.libvirt.driver [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Deletion of /var/lib/nova/instances/1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa_del complete
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.802 185727 INFO nova.compute.manager [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.803 185727 DEBUG oslo.service.loopingcall [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.804 185727 DEBUG nova.compute.manager [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:33:03 compute-0 nova_compute[185723]: 2026-02-16 13:33:03.804 185727 DEBUG nova.network.neutron [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.263 185727 DEBUG nova.compute.manager [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-unplugged-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.264 185727 DEBUG oslo_concurrency.lockutils [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.265 185727 DEBUG oslo_concurrency.lockutils [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.265 185727 DEBUG oslo_concurrency.lockutils [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.265 185727 DEBUG nova.compute.manager [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] No waiting events found dispatching network-vif-unplugged-c0af6030-8607-421e-b581-c7d30d70b02d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:33:05 compute-0 nova_compute[185723]: 2026-02-16 13:33:05.265 185727 DEBUG nova.compute.manager [req-b2b980d4-d29b-4c2b-8760-5c37af219826 req-b1df04e0-ae71-48ee-8d81-756717e24154 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-unplugged-c0af6030-8607-421e-b581-c7d30d70b02d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:33:06 compute-0 nova_compute[185723]: 2026-02-16 13:33:06.776 185727 DEBUG nova.network.neutron [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:33:06 compute-0 nova_compute[185723]: 2026-02-16 13:33:06.893 185727 INFO nova.compute.manager [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Took 3.09 seconds to deallocate network for instance.
Feb 16 13:33:06 compute-0 nova_compute[185723]: 2026-02-16 13:33:06.965 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:06 compute-0 nova_compute[185723]: 2026-02-16 13:33:06.965 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.121 185727 DEBUG nova.compute.manager [req-97021b58-cee7-4287-b844-9da7e6d622f5 req-ef908228-1d02-49ff-9515-dc08aebb1193 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-deleted-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.420 185727 DEBUG nova.compute.manager [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.420 185727 DEBUG oslo_concurrency.lockutils [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.421 185727 DEBUG oslo_concurrency.lockutils [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.421 185727 DEBUG oslo_concurrency.lockutils [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.421 185727 DEBUG nova.compute.manager [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] No waiting events found dispatching network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.421 185727 WARNING nova.compute.manager [req-b10d43de-a260-4bb1-bdc8-b43628ae057b req-063e48e3-b738-4656-9ffa-ba13505abe9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Received unexpected event network-vif-plugged-c0af6030-8607-421e-b581-c7d30d70b02d for instance with vm_state deleted and task_state None.
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.474 185727 DEBUG nova.compute.provider_tree [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.512 185727 DEBUG nova.scheduler.client.report [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.551 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.580 185727 INFO nova.scheduler.client.report [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Deleted allocations for instance 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa
Feb 16 13:33:07 compute-0 nova_compute[185723]: 2026-02-16 13:33:07.708 185727 DEBUG oslo_concurrency.lockutils [None req-46473dd4-3877-4938-b91c-3356dce4bf4d 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.461 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.699 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.850 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.850 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.851 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.852 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.852 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.854 185727 INFO nova.compute.manager [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Terminating instance
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.856 185727 DEBUG nova.compute.manager [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:33:08 compute-0 kernel: tap9ac0912f-d5 (unregistering): left promiscuous mode
Feb 16 13:33:08 compute-0 NetworkManager[56177]: <info>  [1771248788.8873] device (tap9ac0912f-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.887 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-0 ovn_controller[96072]: 2026-02-16T13:33:08Z|00095|binding|INFO|Releasing lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 from this chassis (sb_readonly=0)
Feb 16 13:33:08 compute-0 ovn_controller[96072]: 2026-02-16T13:33:08Z|00096|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 down in Southbound
Feb 16 13:33:08 compute-0 ovn_controller[96072]: 2026-02-16T13:33:08Z|00097|binding|INFO|Removing iface tap9ac0912f-d5 ovn-installed in OVS
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.891 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-0 nova_compute[185723]: 2026-02-16 13:33:08.894 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:08.923 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:1f:94 10.100.0.14'], port_security=['fa:16:3e:ba:1f:94 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '07689e3f-f214-4f57-a662-bc531b614c3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '13', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=9ac0912f-d593-4dad-bf05-01d7dd0b6677) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:33:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:08.926 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 in datapath 34e10b77-8ec0-4af1-a031-d83792585eee unbound from our chassis
Feb 16 13:33:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:08.927 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34e10b77-8ec0-4af1-a031-d83792585eee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:33:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:08.927 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8da14ba8-4b97-47d2-aed8-b62b6c062c90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:08.928 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee namespace which is not needed anymore
Feb 16 13:33:08 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 16 13:33:08 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Consumed 2.127s CPU time.
Feb 16 13:33:08 compute-0 systemd-machined[155229]: Machine qemu-7-instance-00000009 terminated.
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [NOTICE]   (209673) : haproxy version is 2.8.14-c23fe91
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [NOTICE]   (209673) : path to executable is /usr/sbin/haproxy
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [WARNING]  (209673) : Exiting Master process...
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [WARNING]  (209673) : Exiting Master process...
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [ALERT]    (209673) : Current worker (209675) exited with code 143 (Terminated)
Feb 16 13:33:09 compute-0 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209669]: [WARNING]  (209673) : All workers exited. Exiting... (0)
Feb 16 13:33:09 compute-0 systemd[1]: libpod-cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1.scope: Deactivated successfully.
Feb 16 13:33:09 compute-0 podman[210167]: 2026-02-16 13:33:09.064272858 +0000 UTC m=+0.047056924 container died cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:33:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1-userdata-shm.mount: Deactivated successfully.
Feb 16 13:33:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc5a63ec9f3bdef818d1cc65d7d69c8c0ef7ad3f99c8e3572288f2d05872350c-merged.mount: Deactivated successfully.
Feb 16 13:33:09 compute-0 podman[210167]: 2026-02-16 13:33:09.104805268 +0000 UTC m=+0.087589334 container cleanup cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:33:09 compute-0 systemd[1]: libpod-conmon-cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1.scope: Deactivated successfully.
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.122 185727 INFO nova.virt.libvirt.driver [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance destroyed successfully.
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.123 185727 DEBUG nova.objects.instance [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'resources' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.166 185727 DEBUG nova.virt.libvirt.vif [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:31:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:32:52Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.167 185727 DEBUG nova.network.os_vif_util [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.168 185727 DEBUG nova.network.os_vif_util [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.168 185727 DEBUG os_vif [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.169 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.170 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ac0912f-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:33:09 compute-0 podman[210213]: 2026-02-16 13:33:09.174632747 +0000 UTC m=+0.045810622 container remove cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.176 185727 INFO os_vif [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5')
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.177 185727 INFO nova.virt.libvirt.driver [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Deleting instance files /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d_del
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.177 185727 INFO nova.virt.libvirt.driver [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Deletion of /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d_del complete
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.180 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f54f44fc-54a7-4756-b80d-0f20fe6d6d16]: (4, ('Mon Feb 16 01:33:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee (cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1)\ncac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1\nMon Feb 16 01:33:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee (cac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1)\ncac232bda38c5dd79b9caf80bd16d4ab4e13e85376207a1e81efe654f8b314b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.182 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1e72bdcc-d438-4bb9-896a-08531a085b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.183 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.185 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:09 compute-0 kernel: tap34e10b77-80: left promiscuous mode
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.191 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.194 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5b761e-798e-43fc-971d-597b23bfecf0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.207 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae176f2-0287-4290-b2d1-94f6b74b6190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.208 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d2403a4d-1b9d-4a81-b0c1-e9596f78481b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.220 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4481a9-1fed-403f-8b16-db9020bf006e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477092, 'reachable_time': 15815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210229, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.224 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:33:09 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:33:09.225 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[da8faa19-4cff-457a-9c47-e12a01ee83ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:33:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d34e10b77\x2d8ec0\x2d4af1\x2da031\x2dd83792585eee.mount: Deactivated successfully.
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.437 185727 INFO nova.compute.manager [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Took 0.58 seconds to destroy the instance on the hypervisor.
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.438 185727 DEBUG oslo.service.loopingcall [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.439 185727 DEBUG nova.compute.manager [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:33:09 compute-0 nova_compute[185723]: 2026-02-16 13:33:09.439 185727 DEBUG nova.network.neutron [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.099 185727 DEBUG nova.compute.manager [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.100 185727 DEBUG oslo_concurrency.lockutils [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.100 185727 DEBUG oslo_concurrency.lockutils [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.101 185727 DEBUG oslo_concurrency.lockutils [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.101 185727 DEBUG nova.compute.manager [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.101 185727 DEBUG nova.compute.manager [req-6efc0dbf-928b-48f3-ae05-4c495f98c459 req-79fed49f-8aeb-4892-a6c1-07f47481dba7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.459 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.460 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.460 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.505 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.505 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:33:10 compute-0 nova_compute[185723]: 2026-02-16 13:33:10.506 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:11 compute-0 nova_compute[185723]: 2026-02-16 13:33:11.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:11 compute-0 nova_compute[185723]: 2026-02-16 13:33:11.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:12 compute-0 podman[210231]: 2026-02-16 13:33:12.02351824 +0000 UTC m=+0.045782672 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:33:12 compute-0 podman[210230]: 2026-02-16 13:33:12.028934475 +0000 UTC m=+0.057024232 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal)
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.286 185727 DEBUG nova.compute.manager [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.287 185727 DEBUG oslo_concurrency.lockutils [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.287 185727 DEBUG oslo_concurrency.lockutils [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.287 185727 DEBUG oslo_concurrency.lockutils [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.287 185727 DEBUG nova.compute.manager [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.287 185727 WARNING nova.compute.manager [req-2fd71a69-1f28-4c5a-87d4-22f90beb3b25 req-c3879676-fec5-4f50-af8f-7fa4ddc6d549 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state deleting.
Feb 16 13:33:12 compute-0 nova_compute[185723]: 2026-02-16 13:33:12.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.072 185727 DEBUG nova.network.neutron [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.132 185727 INFO nova.compute.manager [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Took 3.69 seconds to deallocate network for instance.
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.232 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.233 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.245 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.348 185727 INFO nova.scheduler.client.report [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Deleted allocations for instance 07689e3f-f214-4f57-a662-bc531b614c3d
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.462 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.504 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.505 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.505 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.506 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.542 185727 DEBUG oslo_concurrency.lockutils [None req-ce57518d-34c6-46f7-bfea-278093d020e7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.645 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.647 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5857MB free_disk=73.2271728515625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.647 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.648 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.776 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.776 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.807 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.822 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.870 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.871 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.871 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:13 compute-0 nova_compute[185723]: 2026-02-16 13:33:13.871 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:33:14 compute-0 nova_compute[185723]: 2026-02-16 13:33:14.227 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:14 compute-0 nova_compute[185723]: 2026-02-16 13:33:14.447 185727 DEBUG nova.compute.manager [req-105190ae-c9c9-4720-9db4-26d06c28208e req-d0bb80be-2828-402e-8d3a-464c6d933956 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-deleted-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:33:14 compute-0 nova_compute[185723]: 2026-02-16 13:33:14.893 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:14 compute-0 nova_compute[185723]: 2026-02-16 13:33:14.923 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:14 compute-0 nova_compute[185723]: 2026-02-16 13:33:14.924 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:33:15 compute-0 sshd-session[210269]: Invalid user guest from 188.166.42.159 port 45968
Feb 16 13:33:15 compute-0 sshd-session[210269]: Connection closed by invalid user guest 188.166.42.159 port 45968 [preauth]
Feb 16 13:33:17 compute-0 podman[210271]: 2026-02-16 13:33:17.062640337 +0000 UTC m=+0.091519142 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:33:17 compute-0 nova_compute[185723]: 2026-02-16 13:33:17.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:18 compute-0 nova_compute[185723]: 2026-02-16 13:33:18.465 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:18 compute-0 nova_compute[185723]: 2026-02-16 13:33:18.669 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248783.668025, 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:33:18 compute-0 nova_compute[185723]: 2026-02-16 13:33:18.670 185727 INFO nova.compute.manager [-] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] VM Stopped (Lifecycle Event)
Feb 16 13:33:19 compute-0 nova_compute[185723]: 2026-02-16 13:33:19.030 185727 DEBUG nova.compute.manager [None req-5298e989-a37e-4fc4-80db-7771e7fee05a - - - - - -] [instance: 1e7b84b5-47ea-44b8-b949-ad82d5f1c8aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:33:19 compute-0 nova_compute[185723]: 2026-02-16 13:33:19.229 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:23 compute-0 nova_compute[185723]: 2026-02-16 13:33:23.469 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:23 compute-0 nova_compute[185723]: 2026-02-16 13:33:23.793 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:24 compute-0 nova_compute[185723]: 2026-02-16 13:33:24.121 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248789.1199477, 07689e3f-f214-4f57-a662-bc531b614c3d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:33:24 compute-0 nova_compute[185723]: 2026-02-16 13:33:24.122 185727 INFO nova.compute.manager [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Stopped (Lifecycle Event)
Feb 16 13:33:24 compute-0 nova_compute[185723]: 2026-02-16 13:33:24.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:24 compute-0 nova_compute[185723]: 2026-02-16 13:33:24.275 185727 DEBUG nova.compute.manager [None req-731345e1-9adf-4e20-a53c-148f5d075992 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:33:27 compute-0 podman[210297]: 2026-02-16 13:33:27.04085918 +0000 UTC m=+0.082333912 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:33:28 compute-0 nova_compute[185723]: 2026-02-16 13:33:28.471 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:29 compute-0 nova_compute[185723]: 2026-02-16 13:33:29.234 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:29 compute-0 podman[195053]: time="2026-02-16T13:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:33:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:33:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:33:30 compute-0 sshd-session[210321]: Invalid user admin from 64.227.72.94 port 47118
Feb 16 13:33:30 compute-0 sshd-session[210321]: Connection closed by invalid user admin 64.227.72.94 port 47118 [preauth]
Feb 16 13:33:31 compute-0 openstack_network_exporter[197909]: ERROR   13:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:33:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:33:31 compute-0 openstack_network_exporter[197909]: ERROR   13:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:33:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:33:33 compute-0 nova_compute[185723]: 2026-02-16 13:33:33.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:33 compute-0 nova_compute[185723]: 2026-02-16 13:33:33.473 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:34 compute-0 nova_compute[185723]: 2026-02-16 13:33:34.236 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:38 compute-0 nova_compute[185723]: 2026-02-16 13:33:38.475 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:39 compute-0 nova_compute[185723]: 2026-02-16 13:33:39.238 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:43 compute-0 podman[210324]: 2026-02-16 13:33:42.999990967 +0000 UTC m=+0.041839303 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:33:43 compute-0 podman[210323]: 2026-02-16 13:33:43.019833292 +0000 UTC m=+0.062776295 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.7)
Feb 16 13:33:43 compute-0 ovn_controller[96072]: 2026-02-16T13:33:43Z|00098|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Feb 16 13:33:43 compute-0 nova_compute[185723]: 2026-02-16 13:33:43.477 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:44 compute-0 nova_compute[185723]: 2026-02-16 13:33:44.240 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:48 compute-0 podman[210362]: 2026-02-16 13:33:48.04171354 +0000 UTC m=+0.073432880 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 16 13:33:48 compute-0 nova_compute[185723]: 2026-02-16 13:33:48.479 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:48 compute-0 sshd-session[210389]: Invalid user admin from 146.190.226.24 port 57882
Feb 16 13:33:48 compute-0 sshd-session[210389]: Connection closed by invalid user admin 146.190.226.24 port 57882 [preauth]
Feb 16 13:33:49 compute-0 nova_compute[185723]: 2026-02-16 13:33:49.242 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:53 compute-0 nova_compute[185723]: 2026-02-16 13:33:53.524 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:54 compute-0 nova_compute[185723]: 2026-02-16 13:33:54.244 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:58 compute-0 podman[210391]: 2026-02-16 13:33:58.007276337 +0000 UTC m=+0.045190407 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:33:58 compute-0 nova_compute[185723]: 2026-02-16 13:33:58.527 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:59 compute-0 nova_compute[185723]: 2026-02-16 13:33:59.246 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:59 compute-0 podman[195053]: time="2026-02-16T13:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:33:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:33:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 13:34:01 compute-0 openstack_network_exporter[197909]: ERROR   13:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:34:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:34:01 compute-0 openstack_network_exporter[197909]: ERROR   13:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:34:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:34:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:03.226 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:03 compute-0 nova_compute[185723]: 2026-02-16 13:34:03.529 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:04 compute-0 nova_compute[185723]: 2026-02-16 13:34:04.248 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:06 compute-0 sshd-session[210415]: Invalid user weblogic from 188.166.42.159 port 53272
Feb 16 13:34:06 compute-0 sshd-session[210415]: Connection closed by invalid user weblogic 188.166.42.159 port 53272 [preauth]
Feb 16 13:34:08 compute-0 nova_compute[185723]: 2026-02-16 13:34:08.533 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:09 compute-0 nova_compute[185723]: 2026-02-16 13:34:09.250 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:10 compute-0 nova_compute[185723]: 2026-02-16 13:34:10.465 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:10 compute-0 nova_compute[185723]: 2026-02-16 13:34:10.466 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:34:10 compute-0 nova_compute[185723]: 2026-02-16 13:34:10.466 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:34:10 compute-0 nova_compute[185723]: 2026-02-16 13:34:10.518 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:34:10 compute-0 nova_compute[185723]: 2026-02-16 13:34:10.519 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:11 compute-0 nova_compute[185723]: 2026-02-16 13:34:11.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:12 compute-0 nova_compute[185723]: 2026-02-16 13:34:12.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:12 compute-0 nova_compute[185723]: 2026-02-16 13:34:12.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:13 compute-0 nova_compute[185723]: 2026-02-16 13:34:13.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:13 compute-0 nova_compute[185723]: 2026-02-16 13:34:13.535 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:14 compute-0 podman[210417]: 2026-02-16 13:34:14.000493695 +0000 UTC m=+0.044901470 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:34:14 compute-0 podman[210418]: 2026-02-16 13:34:14.010644438 +0000 UTC m=+0.050153130 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.252 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.466 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.467 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.467 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.467 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.621 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.623 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.22716903686523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.623 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.623 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.723 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.723 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.770 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.792 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.793 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:34:14 compute-0 nova_compute[185723]: 2026-02-16 13:34:14.793 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:15 compute-0 nova_compute[185723]: 2026-02-16 13:34:15.793 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:15 compute-0 nova_compute[185723]: 2026-02-16 13:34:15.794 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:34:17 compute-0 nova_compute[185723]: 2026-02-16 13:34:17.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:18 compute-0 sshd-session[210457]: Invalid user vps from 146.190.22.227 port 33542
Feb 16 13:34:18 compute-0 sshd-session[210459]: Invalid user admin from 64.227.72.94 port 45490
Feb 16 13:34:18 compute-0 nova_compute[185723]: 2026-02-16 13:34:18.535 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:18 compute-0 podman[210461]: 2026-02-16 13:34:18.564209715 +0000 UTC m=+0.079961864 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:34:18 compute-0 sshd-session[210459]: Connection closed by invalid user admin 64.227.72.94 port 45490 [preauth]
Feb 16 13:34:18 compute-0 sshd-session[210457]: Connection closed by invalid user vps 146.190.22.227 port 33542 [preauth]
Feb 16 13:34:19 compute-0 nova_compute[185723]: 2026-02-16 13:34:19.254 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:23 compute-0 nova_compute[185723]: 2026-02-16 13:34:23.127 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:23 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:23.127 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:34:23 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:23.128 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:34:23 compute-0 nova_compute[185723]: 2026-02-16 13:34:23.583 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:24 compute-0 nova_compute[185723]: 2026-02-16 13:34:24.256 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:27 compute-0 nova_compute[185723]: 2026-02-16 13:34:27.259 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:28 compute-0 nova_compute[185723]: 2026-02-16 13:34:28.587 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:29 compute-0 podman[210487]: 2026-02-16 13:34:29.013133928 +0000 UTC m=+0.049060394 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:34:29 compute-0 nova_compute[185723]: 2026-02-16 13:34:29.339 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:29 compute-0 podman[195053]: time="2026-02-16T13:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:34:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:34:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 13:34:31 compute-0 openstack_network_exporter[197909]: ERROR   13:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:34:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:34:31 compute-0 openstack_network_exporter[197909]: ERROR   13:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:34:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:34:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:34:33.132 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:34:33 compute-0 nova_compute[185723]: 2026-02-16 13:34:33.588 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:34 compute-0 nova_compute[185723]: 2026-02-16 13:34:34.342 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:38 compute-0 nova_compute[185723]: 2026-02-16 13:34:38.592 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:39 compute-0 nova_compute[185723]: 2026-02-16 13:34:39.379 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:34:43 compute-0 nova_compute[185723]: 2026-02-16 13:34:43.593 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:44 compute-0 nova_compute[185723]: 2026-02-16 13:34:44.380 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:45 compute-0 podman[210513]: 2026-02-16 13:34:45.030211252 +0000 UTC m=+0.062047367 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:34:45 compute-0 podman[210512]: 2026-02-16 13:34:45.030454748 +0000 UTC m=+0.065735229 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc.)
Feb 16 13:34:48 compute-0 nova_compute[185723]: 2026-02-16 13:34:48.596 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:49 compute-0 podman[210551]: 2026-02-16 13:34:49.049271607 +0000 UTC m=+0.082951718 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 16 13:34:49 compute-0 nova_compute[185723]: 2026-02-16 13:34:49.382 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:53 compute-0 nova_compute[185723]: 2026-02-16 13:34:53.628 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:54 compute-0 nova_compute[185723]: 2026-02-16 13:34:54.383 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:56 compute-0 sshd-session[210577]: Invalid user admin from 146.190.226.24 port 48552
Feb 16 13:34:56 compute-0 sshd-session[210577]: Connection closed by invalid user admin 146.190.226.24 port 48552 [preauth]
Feb 16 13:34:58 compute-0 nova_compute[185723]: 2026-02-16 13:34:58.629 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:59 compute-0 nova_compute[185723]: 2026-02-16 13:34:59.385 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:59 compute-0 podman[195053]: time="2026-02-16T13:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:34:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:34:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:35:00 compute-0 podman[210581]: 2026-02-16 13:35:00.042510605 +0000 UTC m=+0.081806330 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:35:00 compute-0 sshd-session[210579]: Invalid user mysql from 188.166.42.159 port 36224
Feb 16 13:35:00 compute-0 sshd-session[210579]: Connection closed by invalid user mysql 188.166.42.159 port 36224 [preauth]
Feb 16 13:35:01 compute-0 openstack_network_exporter[197909]: ERROR   13:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:35:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:35:01 compute-0 openstack_network_exporter[197909]: ERROR   13:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:35:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:35:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:03.228 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:03.228 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:03 compute-0 nova_compute[185723]: 2026-02-16 13:35:03.630 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:04 compute-0 nova_compute[185723]: 2026-02-16 13:35:04.387 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:05 compute-0 ovn_controller[96072]: 2026-02-16T13:35:05Z|00099|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:35:05 compute-0 sshd-session[210607]: Invalid user admin from 64.227.72.94 port 47192
Feb 16 13:35:05 compute-0 sshd-session[210607]: Connection closed by invalid user admin 64.227.72.94 port 47192 [preauth]
Feb 16 13:35:08 compute-0 nova_compute[185723]: 2026-02-16 13:35:08.631 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:09 compute-0 nova_compute[185723]: 2026-02-16 13:35:09.389 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:10 compute-0 nova_compute[185723]: 2026-02-16 13:35:10.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.459 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.459 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:12 compute-0 nova_compute[185723]: 2026-02-16 13:35:12.460 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:13 compute-0 nova_compute[185723]: 2026-02-16 13:35:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.278 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.391 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.467 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.468 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.468 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.468 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.630 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.631 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5864MB free_disk=73.22716522216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.632 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.632 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.707 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.708 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.740 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.777 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.779 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:35:14 compute-0 nova_compute[185723]: 2026-02-16 13:35:14.779 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:15 compute-0 nova_compute[185723]: 2026-02-16 13:35:15.775 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:15 compute-0 nova_compute[185723]: 2026-02-16 13:35:15.776 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:15 compute-0 nova_compute[185723]: 2026-02-16 13:35:15.776 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:35:16 compute-0 podman[210610]: 2026-02-16 13:35:16.029474916 +0000 UTC m=+0.060810406 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:35:16 compute-0 podman[210609]: 2026-02-16 13:35:16.036656355 +0000 UTC m=+0.070367455 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, io.buildah.version=1.33.7, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:35:17 compute-0 nova_compute[185723]: 2026-02-16 13:35:17.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:17 compute-0 nova_compute[185723]: 2026-02-16 13:35:17.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:19 compute-0 nova_compute[185723]: 2026-02-16 13:35:19.280 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:19 compute-0 nova_compute[185723]: 2026-02-16 13:35:19.392 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:20 compute-0 podman[210648]: 2026-02-16 13:35:20.052244904 +0000 UTC m=+0.095077170 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:35:24 compute-0 nova_compute[185723]: 2026-02-16 13:35:24.281 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:24 compute-0 nova_compute[185723]: 2026-02-16 13:35:24.394 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:26 compute-0 nova_compute[185723]: 2026-02-16 13:35:26.283 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:26 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:26.285 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:35:26 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:26.286 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:35:29 compute-0 nova_compute[185723]: 2026-02-16 13:35:29.284 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:29 compute-0 nova_compute[185723]: 2026-02-16 13:35:29.395 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:29 compute-0 podman[195053]: time="2026-02-16T13:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:35:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:35:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 13:35:31 compute-0 podman[210674]: 2026-02-16 13:35:31.016245863 +0000 UTC m=+0.057784721 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:35:31 compute-0 openstack_network_exporter[197909]: ERROR   13:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:35:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:35:31 compute-0 openstack_network_exporter[197909]: ERROR   13:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:35:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:35:32 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:35:32.289 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:34 compute-0 nova_compute[185723]: 2026-02-16 13:35:34.286 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:34 compute-0 nova_compute[185723]: 2026-02-16 13:35:34.397 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:39 compute-0 nova_compute[185723]: 2026-02-16 13:35:39.287 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:39 compute-0 nova_compute[185723]: 2026-02-16 13:35:39.398 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:44 compute-0 nova_compute[185723]: 2026-02-16 13:35:44.290 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:44 compute-0 nova_compute[185723]: 2026-02-16 13:35:44.399 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:47 compute-0 podman[210698]: 2026-02-16 13:35:47.022230702 +0000 UTC m=+0.060153880 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Feb 16 13:35:47 compute-0 podman[210699]: 2026-02-16 13:35:47.022315494 +0000 UTC m=+0.058177231 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:35:49 compute-0 nova_compute[185723]: 2026-02-16 13:35:49.296 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:49 compute-0 nova_compute[185723]: 2026-02-16 13:35:49.402 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:51 compute-0 podman[210736]: 2026-02-16 13:35:51.09250116 +0000 UTC m=+0.125260702 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:35:53 compute-0 sshd-session[210764]: Invalid user admin from 64.227.72.94 port 48302
Feb 16 13:35:53 compute-0 sshd-session[210764]: Connection closed by invalid user admin 64.227.72.94 port 48302 [preauth]
Feb 16 13:35:54 compute-0 nova_compute[185723]: 2026-02-16 13:35:54.296 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-0 nova_compute[185723]: 2026-02-16 13:35:54.404 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:56 compute-0 sshd-session[210766]: Invalid user apache from 188.166.42.159 port 38720
Feb 16 13:35:56 compute-0 sshd-session[210766]: Connection closed by invalid user apache 188.166.42.159 port 38720 [preauth]
Feb 16 13:35:59 compute-0 nova_compute[185723]: 2026-02-16 13:35:59.297 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:59 compute-0 nova_compute[185723]: 2026-02-16 13:35:59.405 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:59 compute-0 podman[195053]: time="2026-02-16T13:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:35:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:35:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:36:00 compute-0 sshd-session[210768]: Invalid user testuser from 146.190.22.227 port 53354
Feb 16 13:36:00 compute-0 sshd-session[210768]: Connection closed by invalid user testuser 146.190.22.227 port 53354 [preauth]
Feb 16 13:36:01 compute-0 openstack_network_exporter[197909]: ERROR   13:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:36:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:36:01 compute-0 openstack_network_exporter[197909]: ERROR   13:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:36:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:36:02 compute-0 podman[210770]: 2026-02-16 13:36:02.013481209 +0000 UTC m=+0.052344075 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:36:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:03.227 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:03.228 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:03.228 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.306 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.407 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.826 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.826 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.865 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:36:04 compute-0 sshd-session[210795]: Invalid user admin from 146.190.226.24 port 37486
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.969 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.970 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.978 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:36:04 compute-0 nova_compute[185723]: 2026-02-16 13:36:04.979 185727 INFO nova.compute.claims [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.088 185727 DEBUG nova.compute.provider_tree [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.109 185727 DEBUG nova.scheduler.client.report [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.129 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.130 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:36:05 compute-0 sshd-session[210795]: Connection closed by invalid user admin 146.190.226.24 port 37486 [preauth]
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.177 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.178 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.203 185727 INFO nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.235 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.394 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.396 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.396 185727 INFO nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Creating image(s)
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.397 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.397 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.398 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.411 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.463 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.465 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.465 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.475 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.526 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.527 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.560 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.562 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.562 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.613 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.614 185727 DEBUG nova.virt.disk.api [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.615 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.674 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.677 185727 DEBUG nova.virt.disk.api [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.679 185727 DEBUG nova.objects.instance [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 6403f5ce-8933-4efa-b4a5-611cd66c8a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.699 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.699 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Ensure instance console log exists: /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.700 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.700 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:05 compute-0 nova_compute[185723]: 2026-02-16 13:36:05.701 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:06 compute-0 nova_compute[185723]: 2026-02-16 13:36:06.242 185727 DEBUG nova.policy [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:36:07 compute-0 nova_compute[185723]: 2026-02-16 13:36:07.778 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Successfully created port: fe70cd80-cb05-4753-88d4-22ea28d8e9b0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.015 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Successfully updated port: fe70cd80-cb05-4753-88d4-22ea28d8e9b0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.039 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.040 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.040 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.228 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.269 185727 DEBUG nova.compute.manager [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-changed-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.270 185727 DEBUG nova.compute.manager [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Refreshing instance network info cache due to event network-changed-fe70cd80-cb05-4753-88d4-22ea28d8e9b0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.270 185727 DEBUG oslo_concurrency.lockutils [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.309 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:09 compute-0 nova_compute[185723]: 2026-02-16 13:36:09.408 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.587 185727 DEBUG nova.network.neutron [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updating instance_info_cache with network_info: [{"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.611 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.612 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Instance network_info: |[{"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.612 185727 DEBUG oslo_concurrency.lockutils [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.613 185727 DEBUG nova.network.neutron [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Refreshing network info cache for port fe70cd80-cb05-4753-88d4-22ea28d8e9b0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.616 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Start _get_guest_xml network_info=[{"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.621 185727 WARNING nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.628 185727 DEBUG nova.virt.libvirt.host [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.629 185727 DEBUG nova.virt.libvirt.host [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.636 185727 DEBUG nova.virt.libvirt.host [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.637 185727 DEBUG nova.virt.libvirt.host [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.638 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.638 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.638 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.639 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.639 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.639 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.639 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.640 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.640 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.640 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.640 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.640 185727 DEBUG nova.virt.hardware [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.644 185727 DEBUG nova.virt.libvirt.vif [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1632444659',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1632444659',id=12,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-efsiex6g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:36:05Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=6403f5ce-8933-4efa-b4a5-611cd66c8a29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.644 185727 DEBUG nova.network.os_vif_util [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.645 185727 DEBUG nova.network.os_vif_util [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.646 185727 DEBUG nova.objects.instance [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6403f5ce-8933-4efa-b4a5-611cd66c8a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.671 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <uuid>6403f5ce-8933-4efa-b4a5-611cd66c8a29</uuid>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <name>instance-0000000c</name>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-1632444659</nova:name>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:36:10</nova:creationTime>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         <nova:port uuid="fe70cd80-cb05-4753-88d4-22ea28d8e9b0">
Feb 16 13:36:10 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <system>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="serial">6403f5ce-8933-4efa-b4a5-611cd66c8a29</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="uuid">6403f5ce-8933-4efa-b4a5-611cd66c8a29</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </system>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <os>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </os>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <features>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </features>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.config"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:ff:48:e8"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <target dev="tapfe70cd80-cb"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/console.log" append="off"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <video>
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </video>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:36:10 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:36:10 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:36:10 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:36:10 compute-0 nova_compute[185723]: </domain>
Feb 16 13:36:10 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.672 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Preparing to wait for external event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.673 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.673 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.673 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.674 185727 DEBUG nova.virt.libvirt.vif [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1632444659',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1632444659',id=12,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-efsiex6g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:36:05Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=6403f5ce-8933-4efa-b4a5-611cd66c8a29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.674 185727 DEBUG nova.network.os_vif_util [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.675 185727 DEBUG nova.network.os_vif_util [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.676 185727 DEBUG os_vif [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.676 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.677 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.677 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.680 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.680 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe70cd80-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.680 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe70cd80-cb, col_values=(('external_ids', {'iface-id': 'fe70cd80-cb05-4753-88d4-22ea28d8e9b0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:48:e8', 'vm-uuid': '6403f5ce-8933-4efa-b4a5-611cd66c8a29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.682 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:10 compute-0 NetworkManager[56177]: <info>  [1771248970.6843] manager: (tapfe70cd80-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.685 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.688 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.689 185727 INFO os_vif [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb')
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.886 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.887 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.887 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:ff:48:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:36:10 compute-0 nova_compute[185723]: 2026-02-16 13:36:10.887 185727 INFO nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Using config drive
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.358 185727 INFO nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Creating config drive at /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.config
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.363 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplqsj6tyh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.488 185727 DEBUG oslo_concurrency.processutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplqsj6tyh" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:11 compute-0 kernel: tapfe70cd80-cb: entered promiscuous mode
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.5564] manager: (tapfe70cd80-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.556 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 ovn_controller[96072]: 2026-02-16T13:36:11Z|00100|binding|INFO|Claiming lport fe70cd80-cb05-4753-88d4-22ea28d8e9b0 for this chassis.
Feb 16 13:36:11 compute-0 ovn_controller[96072]: 2026-02-16T13:36:11Z|00101|binding|INFO|fe70cd80-cb05-4753-88d4-22ea28d8e9b0: Claiming fa:16:3e:ff:48:e8 10.100.0.5
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.559 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.565 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.575 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:48:e8 10.100.0.5'], port_security=['fa:16:3e:ff:48:e8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6403f5ce-8933-4efa-b4a5-611cd66c8a29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=fe70cd80-cb05-4753-88d4-22ea28d8e9b0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.577 105360 INFO neutron.agent.ovn.metadata.agent [-] Port fe70cd80-cb05-4753-88d4-22ea28d8e9b0 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.578 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:36:11 compute-0 ovn_controller[96072]: 2026-02-16T13:36:11Z|00102|binding|INFO|Setting lport fe70cd80-cb05-4753-88d4-22ea28d8e9b0 ovn-installed in OVS
Feb 16 13:36:11 compute-0 ovn_controller[96072]: 2026-02-16T13:36:11Z|00103|binding|INFO|Setting lport fe70cd80-cb05-4753-88d4-22ea28d8e9b0 up in Southbound
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.581 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 systemd-udevd[210832]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.589 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8a10ac18-26c7-4cd8-8a2f-53a23a810c9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.592 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:36:11 compute-0 systemd-machined[155229]: New machine qemu-8-instance-0000000c.
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.594 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.594 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa0f1a8-8ba9-488c-9f92-f5e6a551c7f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.596 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[06d51b48-58bb-41f1-b755-9ac859a6f919]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.6063] device (tapfe70cd80-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.6071] device (tapfe70cd80-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:36:11 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000c.
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.611 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[17f6cb57-a308-4e5c-8814-1791fe03c0a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.624 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b584d5fc-952b-4456-9431-094503df6e28]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.651 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[159e5844-e187-4a95-a1c0-21804b5e47fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.657 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e4479ac1-24e4-4034-b299-3843b0d4786f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.6586] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.686 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[72a71409-6e5a-4b74-9df6-d944411f28dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.690 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[00ed8869-e2d8-442c-8fe6-7775aef68192]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.7099] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.715 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[1acb2f68-b3f8-472f-8633-8cc7d3754c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.733 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[95832259-f631-4f4b-8d04-e8d4b22171d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505510, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210864, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.749 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[76577911-f522-43d5-8d61-6c409596e73f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505510, 'tstamp': 505510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210865, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.769 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[83b0fbca-3bc9-41ff-bb61-e5acecaa8873]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505510, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210866, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.820 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[60e62294-7d6a-4824-a546-a7bfc20c9fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.822 185727 DEBUG nova.compute.manager [req-740604d3-d7af-4784-a609-be18479f21e5 req-da14e1e3-a6a4-4480-b36d-402c07798955 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.823 185727 DEBUG oslo_concurrency.lockutils [req-740604d3-d7af-4784-a609-be18479f21e5 req-da14e1e3-a6a4-4480-b36d-402c07798955 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.823 185727 DEBUG oslo_concurrency.lockutils [req-740604d3-d7af-4784-a609-be18479f21e5 req-da14e1e3-a6a4-4480-b36d-402c07798955 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.823 185727 DEBUG oslo_concurrency.lockutils [req-740604d3-d7af-4784-a609-be18479f21e5 req-da14e1e3-a6a4-4480-b36d-402c07798955 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.824 185727 DEBUG nova.compute.manager [req-740604d3-d7af-4784-a609-be18479f21e5 req-da14e1e3-a6a4-4480-b36d-402c07798955 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Processing event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.869 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b90d62-9c7b-4022-96f6-aa8535f2c9ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.870 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.871 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.871 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:11 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.872 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 NetworkManager[56177]: <info>  [1771248971.8748] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.879 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.880 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.881 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 ovn_controller[96072]: 2026-02-16T13:36:11Z|00104|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:36:11 compute-0 nova_compute[185723]: 2026-02-16 13:36:11.886 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.888 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.889 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[52dd0c5c-9cf9-4fe2-b1c1-bc8e949f60aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.890 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:36:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:11.890 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.228 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.229 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248972.228106, 6403f5ce-8933-4efa-b4a5-611cd66c8a29 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.230 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] VM Started (Lifecycle Event)
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.232 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.235 185727 INFO nova.virt.libvirt.driver [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Instance spawned successfully.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.236 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.255 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.261 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.267 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.268 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.268 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.269 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.269 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.269 185727 DEBUG nova.virt.libvirt.driver [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.294 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.295 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248972.2293608, 6403f5ce-8933-4efa-b4a5-611cd66c8a29 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:12 compute-0 podman[210904]: 2026-02-16 13:36:12.198849682 +0000 UTC m=+0.022706297 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.296 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] VM Paused (Lifecycle Event)
Feb 16 13:36:12 compute-0 podman[210904]: 2026-02-16 13:36:12.358814158 +0000 UTC m=+0.182670753 container create a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.365 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.366 185727 INFO nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Took 6.97 seconds to spawn the instance on the hypervisor.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.367 185727 DEBUG nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.373 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771248972.232346, 6403f5ce-8933-4efa-b4a5-611cd66c8a29 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.373 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] VM Resumed (Lifecycle Event)
Feb 16 13:36:12 compute-0 systemd[1]: Started libpod-conmon-a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb.scope.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.404 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.409 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:36:12 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:36:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aedd1a1ede9506bdddae3bf326748c7bd036b6de99942b5e003086620f9f6a63/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.435 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:36:12 compute-0 podman[210904]: 2026-02-16 13:36:12.446020782 +0000 UTC m=+0.269877477 container init a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:36:12 compute-0 podman[210904]: 2026-02-16 13:36:12.454943934 +0000 UTC m=+0.278800529 container start a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.455 185727 INFO nova.compute.manager [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Took 7.52 seconds to build instance.
Feb 16 13:36:12 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [NOTICE]   (210925) : New worker (210927) forked
Feb 16 13:36:12 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [NOTICE]   (210925) : Loading success.
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.479 185727 DEBUG oslo_concurrency.lockutils [None req-35ac8cee-f1af-421a-8897-8c5a538f4e01 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.588 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.689 185727 DEBUG nova.network.neutron [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updated VIF entry in instance network info cache for port fe70cd80-cb05-4753-88d4-22ea28d8e9b0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.690 185727 DEBUG nova.network.neutron [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updating instance_info_cache with network_info: [{"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.717 185727 DEBUG oslo_concurrency.lockutils [req-d0942370-4853-4058-9f3e-a68a9a57b5f2 req-ce13d5b1-9b14-48bf-991c-51ce6d749d1d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.717 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.718 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:36:12 compute-0 nova_compute[185723]: 2026-02-16 13:36:12.718 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6403f5ce-8933-4efa-b4a5-611cd66c8a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.939 185727 DEBUG nova.compute.manager [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.940 185727 DEBUG oslo_concurrency.lockutils [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.940 185727 DEBUG oslo_concurrency.lockutils [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.940 185727 DEBUG oslo_concurrency.lockutils [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.940 185727 DEBUG nova.compute.manager [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] No waiting events found dispatching network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:13 compute-0 nova_compute[185723]: 2026-02-16 13:36:13.940 185727 WARNING nova.compute.manager [req-aaf58c3a-c413-4784-aa37-a8d9f7533142 req-b1e19477-1534-49c9-b425-6274a95c2197 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received unexpected event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 for instance with vm_state active and task_state None.
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.311 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.819 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updating instance_info_cache with network_info: [{"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.844 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-6403f5ce-8933-4efa-b4a5-611cd66c8a29" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.845 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.845 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:14 compute-0 nova_compute[185723]: 2026-02-16 13:36:14.845 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.456 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.457 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.457 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.457 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.528 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.613 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.614 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.664 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.735 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.832 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.833 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5653MB free_disk=73.22626495361328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.834 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.835 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.951 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 6403f5ce-8933-4efa-b4a5-611cd66c8a29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.952 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.952 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.969 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.989 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:36:15 compute-0 nova_compute[185723]: 2026-02-16 13:36:15.989 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.007 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.035 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.081 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.096 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.543 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:36:16 compute-0 nova_compute[185723]: 2026-02-16 13:36:16.543 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:18 compute-0 podman[210944]: 2026-02-16 13:36:18.02665899 +0000 UTC m=+0.061272564 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 13:36:18 compute-0 podman[210943]: 2026-02-16 13:36:18.430620382 +0000 UTC m=+0.463735000 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:36:18 compute-0 nova_compute[185723]: 2026-02-16 13:36:18.544 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:18 compute-0 nova_compute[185723]: 2026-02-16 13:36:18.545 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:36:19 compute-0 nova_compute[185723]: 2026-02-16 13:36:19.313 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:19 compute-0 nova_compute[185723]: 2026-02-16 13:36:19.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:20 compute-0 nova_compute[185723]: 2026-02-16 13:36:20.739 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:22 compute-0 podman[210982]: 2026-02-16 13:36:22.052803564 +0000 UTC m=+0.087578720 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:36:24 compute-0 nova_compute[185723]: 2026-02-16 13:36:24.318 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:25 compute-0 ovn_controller[96072]: 2026-02-16T13:36:25Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:48:e8 10.100.0.5
Feb 16 13:36:25 compute-0 ovn_controller[96072]: 2026-02-16T13:36:25Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:48:e8 10.100.0.5
Feb 16 13:36:25 compute-0 nova_compute[185723]: 2026-02-16 13:36:25.742 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:29 compute-0 nova_compute[185723]: 2026-02-16 13:36:29.321 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:29 compute-0 podman[195053]: time="2026-02-16T13:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:36:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:36:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 13:36:30 compute-0 nova_compute[185723]: 2026-02-16 13:36:30.746 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:30 compute-0 nova_compute[185723]: 2026-02-16 13:36:30.859 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating tmpfile /var/lib/nova/instances/tmpc1cj4x3y to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:36:30 compute-0 nova_compute[185723]: 2026-02-16 13:36:30.860 185727 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:36:31 compute-0 openstack_network_exporter[197909]: ERROR   13:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:36:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:36:31 compute-0 openstack_network_exporter[197909]: ERROR   13:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:36:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:36:32 compute-0 nova_compute[185723]: 2026-02-16 13:36:32.556 185727 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:36:32 compute-0 nova_compute[185723]: 2026-02-16 13:36:32.587 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:32 compute-0 nova_compute[185723]: 2026-02-16 13:36:32.587 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:32 compute-0 nova_compute[185723]: 2026-02-16 13:36:32.588 185727 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:36:33 compute-0 podman[211030]: 2026-02-16 13:36:33.02605794 +0000 UTC m=+0.057047471 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:36:34 compute-0 nova_compute[185723]: 2026-02-16 13:36:34.323 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.278 185727 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.299 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.301 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.301 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating instance directory: /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.302 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating disk.info with the contents: {'/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk': 'qcow2', '/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.302 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.303 185727 DEBUG nova.objects.instance [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.329 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.382 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.383 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.384 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.394 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.441 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.442 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.466 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk 1073741824" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.467 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.467 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.516 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.518 185727 DEBUG nova.virt.disk.api [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.519 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.575 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.576 185727 DEBUG nova.virt.disk.api [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.577 185727 DEBUG nova.objects.instance [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.669 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.700 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config 485376" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.702 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config to /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.703 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:35 compute-0 nova_compute[185723]: 2026-02-16 13:36:35.748 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.160 185727 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.161 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.163 185727 DEBUG nova.virt.libvirt.vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:35:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:35:56Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.163 185727 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.164 185727 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.164 185727 DEBUG os_vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.165 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.165 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.168 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.168 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b908a4c-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.169 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b908a4c-c9, col_values=(('external_ids', {'iface-id': '0b908a4c-c96e-4244-b4b4-87f4ac6110bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:03:5a', 'vm-uuid': 'fea12b84-b444-4299-a1d9-2e974fbb93e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.170 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:36 compute-0 NetworkManager[56177]: <info>  [1771248996.1716] manager: (tap0b908a4c-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.173 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.177 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.178 185727 INFO os_vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9')
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.179 185727 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:36:36 compute-0 nova_compute[185723]: 2026-02-16 13:36:36.179 185727 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:36:39 compute-0 nova_compute[185723]: 2026-02-16 13:36:39.216 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:39.216 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:39.219 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:36:39 compute-0 nova_compute[185723]: 2026-02-16 13:36:39.325 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.250 185727 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.251 185727 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:36:40 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:36:40 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:36:40 compute-0 NetworkManager[56177]: <info>  [1771249000.5845] manager: (tap0b908a4c-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Feb 16 13:36:40 compute-0 kernel: tap0b908a4c-c9: entered promiscuous mode
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.588 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:40 compute-0 ovn_controller[96072]: 2026-02-16T13:36:40Z|00105|binding|INFO|Claiming lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd for this additional chassis.
Feb 16 13:36:40 compute-0 ovn_controller[96072]: 2026-02-16T13:36:40Z|00106|binding|INFO|0b908a4c-c96e-4244-b4b4-87f4ac6110bd: Claiming fa:16:3e:37:03:5a 10.100.0.14
Feb 16 13:36:40 compute-0 sshd-session[211077]: Invalid user admin from 64.227.72.94 port 54672
Feb 16 13:36:40 compute-0 ovn_controller[96072]: 2026-02-16T13:36:40Z|00107|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd ovn-installed in OVS
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.596 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.599 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:40 compute-0 systemd-udevd[211113]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:36:40 compute-0 systemd-machined[155229]: New machine qemu-9-instance-0000000b.
Feb 16 13:36:40 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000b.
Feb 16 13:36:40 compute-0 NetworkManager[56177]: <info>  [1771249000.6255] device (tap0b908a4c-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:36:40 compute-0 NetworkManager[56177]: <info>  [1771249000.6261] device (tap0b908a4c-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:36:40 compute-0 sshd-session[211077]: Connection closed by invalid user admin 64.227.72.94 port 54672 [preauth]
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.989 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249000.98898, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:40 compute-0 nova_compute[185723]: 2026-02-16 13:36:40.992 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Started (Lifecycle Event)
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.013 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.170 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:41.221 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.758 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249001.757857, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.758 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Resumed (Lifecycle Event)
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.778 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.782 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:36:41 compute-0 nova_compute[185723]: 2026-02-16 13:36:41.802 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:36:42 compute-0 ovn_controller[96072]: 2026-02-16T13:36:42Z|00108|binding|INFO|Claiming lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd for this chassis.
Feb 16 13:36:42 compute-0 ovn_controller[96072]: 2026-02-16T13:36:42Z|00109|binding|INFO|0b908a4c-c96e-4244-b4b4-87f4ac6110bd: Claiming fa:16:3e:37:03:5a 10.100.0.14
Feb 16 13:36:42 compute-0 ovn_controller[96072]: 2026-02-16T13:36:42Z|00110|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd up in Southbound
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.757 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:03:5a 10.100.0.14'], port_security=['fa:16:3e:37:03:5a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fea12b84-b444-4299-a1d9-2e974fbb93e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '10', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=0b908a4c-c96e-4244-b4b4-87f4ac6110bd) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.758 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.759 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.775 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b69bd78d-fec6-4a5e-ac99-ed9155513af8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.805 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[8c2e2121-ed34-48bb-ac3a-8105777c8114]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.809 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[79c8a656-1dd1-4fe8-81cc-6db285a8ab87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.832 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1ef90d-682b-497d-9387-ad298edf3392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.845 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4d77b7ef-1ab5-4678-8ee6-ea8c1f9d7391]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505510, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211145, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.859 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd000d6-3ddc-4bce-8a69-23b0e7df9274]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505523, 'tstamp': 505523}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211146, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505525, 'tstamp': 505525}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211146, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.860 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:42 compute-0 nova_compute[185723]: 2026-02-16 13:36:42.862 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-0 nova_compute[185723]: 2026-02-16 13:36:42.863 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.864 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.865 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.865 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:42.865 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:42 compute-0 nova_compute[185723]: 2026-02-16 13:36:42.900 185727 INFO nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Post operation of migration started
Feb 16 13:36:43 compute-0 nova_compute[185723]: 2026-02-16 13:36:43.507 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:43 compute-0 nova_compute[185723]: 2026-02-16 13:36:43.508 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:43 compute-0 nova_compute[185723]: 2026-02-16 13:36:43.508 185727 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:36:44 compute-0 nova_compute[185723]: 2026-02-16 13:36:44.376 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.172 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.454 185727 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.471 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.487 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.489 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.489 185727 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:46 compute-0 nova_compute[185723]: 2026-02-16 13:36:46.495 185727 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:36:46 compute-0 virtqemud[184843]: Domain id=9 name='instance-0000000b' uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0 is tainted: custom-monitor
Feb 16 13:36:47 compute-0 nova_compute[185723]: 2026-02-16 13:36:47.505 185727 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:36:48 compute-0 nova_compute[185723]: 2026-02-16 13:36:48.511 185727 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:36:48 compute-0 nova_compute[185723]: 2026-02-16 13:36:48.518 185727 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:48 compute-0 nova_compute[185723]: 2026-02-16 13:36:48.739 185727 DEBUG nova.objects.instance [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:36:49 compute-0 podman[211148]: 2026-02-16 13:36:49.026502909 +0000 UTC m=+0.061228413 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:36:49 compute-0 podman[211147]: 2026-02-16 13:36:49.027614707 +0000 UTC m=+0.062340291 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:36:49 compute-0 nova_compute[185723]: 2026-02-16 13:36:49.414 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:49 compute-0 sshd-session[211186]: Invalid user postgres from 188.166.42.159 port 56662
Feb 16 13:36:49 compute-0 sshd-session[211186]: Connection closed by invalid user postgres 188.166.42.159 port 56662 [preauth]
Feb 16 13:36:51 compute-0 nova_compute[185723]: 2026-02-16 13:36:51.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:53 compute-0 podman[211188]: 2026-02-16 13:36:53.043252102 +0000 UTC m=+0.072990602 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.415 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.556 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.557 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.557 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.557 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.557 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.558 185727 INFO nova.compute.manager [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Terminating instance
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.559 185727 DEBUG nova.compute.manager [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:36:54 compute-0 kernel: tapfe70cd80-cb (unregistering): left promiscuous mode
Feb 16 13:36:54 compute-0 NetworkManager[56177]: <info>  [1771249014.5918] device (tapfe70cd80-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:36:54 compute-0 ovn_controller[96072]: 2026-02-16T13:36:54Z|00111|binding|INFO|Releasing lport fe70cd80-cb05-4753-88d4-22ea28d8e9b0 from this chassis (sb_readonly=0)
Feb 16 13:36:54 compute-0 ovn_controller[96072]: 2026-02-16T13:36:54Z|00112|binding|INFO|Setting lport fe70cd80-cb05-4753-88d4-22ea28d8e9b0 down in Southbound
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.599 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 ovn_controller[96072]: 2026-02-16T13:36:54Z|00113|binding|INFO|Removing iface tapfe70cd80-cb ovn-installed in OVS
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.607 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.609 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:48:e8 10.100.0.5'], port_security=['fa:16:3e:ff:48:e8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6403f5ce-8933-4efa-b4a5-611cd66c8a29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=fe70cd80-cb05-4753-88d4-22ea28d8e9b0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.610 105360 INFO neutron.agent.ovn.metadata.agent [-] Port fe70cd80-cb05-4753-88d4-22ea28d8e9b0 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.611 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.628 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc4ea95-1c6c-40be-9e2d-5ae8079c84b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Feb 16 13:36:54 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000c.scope: Consumed 13.482s CPU time.
Feb 16 13:36:54 compute-0 systemd-machined[155229]: Machine qemu-8-instance-0000000c terminated.
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.655 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[6d948fa1-e9a9-4824-98ca-5c43c43a4551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.659 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[127bad23-10c3-47ea-b4d3-f8d14ad3c719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.687 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a841aa94-f9b2-434b-989e-8ed185a4c8a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.704 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5599b1-a28b-4592-b2bf-544879933ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505510, 'reachable_time': 30793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211226, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.719 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3b7b050a-f8e3-437d-ab5f-89124cb24ddb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505523, 'tstamp': 505523}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211227, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505525, 'tstamp': 505525}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211227, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.720 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.722 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.727 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.728 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.728 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.729 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:54.729 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.829 185727 INFO nova.virt.libvirt.driver [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Instance destroyed successfully.
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.829 185727 DEBUG nova.objects.instance [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 6403f5ce-8933-4efa-b4a5-611cd66c8a29 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.880 185727 DEBUG nova.virt.libvirt.vif [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:36:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1632444659',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1632444659',id=12,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:36:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-efsiex6g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:36:12Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=6403f5ce-8933-4efa-b4a5-611cd66c8a29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.880 185727 DEBUG nova.network.os_vif_util [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "address": "fa:16:3e:ff:48:e8", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe70cd80-cb", "ovs_interfaceid": "fe70cd80-cb05-4753-88d4-22ea28d8e9b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.882 185727 DEBUG nova.network.os_vif_util [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.882 185727 DEBUG os_vif [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.885 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.885 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe70cd80-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.888 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.889 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.891 185727 INFO os_vif [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:48:e8,bridge_name='br-int',has_traffic_filtering=True,id=fe70cd80-cb05-4753-88d4-22ea28d8e9b0,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe70cd80-cb')
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.892 185727 INFO nova.virt.libvirt.driver [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Deleting instance files /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29_del
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.893 185727 INFO nova.virt.libvirt.driver [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Deletion of /var/lib/nova/instances/6403f5ce-8933-4efa-b4a5-611cd66c8a29_del complete
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.952 185727 INFO nova.compute.manager [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.952 185727 DEBUG oslo.service.loopingcall [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.953 185727 DEBUG nova.compute.manager [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:36:54 compute-0 nova_compute[185723]: 2026-02-16 13:36:54.953 185727 DEBUG nova.network.neutron [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.621 185727 DEBUG nova.compute.manager [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-unplugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.622 185727 DEBUG oslo_concurrency.lockutils [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.622 185727 DEBUG oslo_concurrency.lockutils [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.622 185727 DEBUG oslo_concurrency.lockutils [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.622 185727 DEBUG nova.compute.manager [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] No waiting events found dispatching network-vif-unplugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:55 compute-0 nova_compute[185723]: 2026-02-16 13:36:55.623 185727 DEBUG nova.compute.manager [req-2ce7abad-906c-4616-a332-ad36e0bf33ef req-e924d971-4b50-4325-b710-711a2c0e08a7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-unplugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.326 185727 DEBUG nova.network.neutron [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.368 185727 INFO nova.compute.manager [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Took 2.42 seconds to deallocate network for instance.
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.419 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.420 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.498 185727 DEBUG nova.compute.provider_tree [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.529 185727 DEBUG nova.scheduler.client.report [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.556 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.599 185727 INFO nova.scheduler.client.report [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 6403f5ce-8933-4efa-b4a5-611cd66c8a29
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.691 185727 DEBUG oslo_concurrency.lockutils [None req-9d5a0809-5b16-4de9-9a57-650db58c43a8 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.732 185727 DEBUG nova.compute.manager [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.732 185727 DEBUG oslo_concurrency.lockutils [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.732 185727 DEBUG oslo_concurrency.lockutils [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.733 185727 DEBUG oslo_concurrency.lockutils [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6403f5ce-8933-4efa-b4a5-611cd66c8a29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.733 185727 DEBUG nova.compute.manager [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] No waiting events found dispatching network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.733 185727 WARNING nova.compute.manager [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received unexpected event network-vif-plugged-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 for instance with vm_state deleted and task_state None.
Feb 16 13:36:57 compute-0 nova_compute[185723]: 2026-02-16 13:36:57.733 185727 DEBUG nova.compute.manager [req-31fbaa67-dcfc-458c-87f8-1bcea0bb5fbd req-bf769909-e437-4574-ac57-7e2df99ce0e1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Received event network-vif-deleted-fe70cd80-cb05-4753-88d4-22ea28d8e9b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.941 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.941 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.942 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.942 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.943 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.945 185727 INFO nova.compute.manager [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Terminating instance
Feb 16 13:36:58 compute-0 nova_compute[185723]: 2026-02-16 13:36:58.946 185727 DEBUG nova.compute.manager [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:36:58 compute-0 kernel: tap0b908a4c-c9 (unregistering): left promiscuous mode
Feb 16 13:36:58 compute-0 NetworkManager[56177]: <info>  [1771249018.9752] device (tap0b908a4c-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:36:59 compute-0 ovn_controller[96072]: 2026-02-16T13:36:59Z|00114|binding|INFO|Releasing lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd from this chassis (sb_readonly=0)
Feb 16 13:36:59 compute-0 ovn_controller[96072]: 2026-02-16T13:36:59Z|00115|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd down in Southbound
Feb 16 13:36:59 compute-0 ovn_controller[96072]: 2026-02-16T13:36:59Z|00116|binding|INFO|Removing iface tap0b908a4c-c9 ovn-installed in OVS
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.008 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.011 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.013 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.016 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:03:5a 10.100.0.14'], port_security=['fa:16:3e:37:03:5a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fea12b84-b444-4299-a1d9-2e974fbb93e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '12', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=0b908a4c-c96e-4244-b4b4-87f4ac6110bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.017 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.019 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.020 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[398bf4a9-e0fd-4b7e-8e54-490ee80b3855]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.020 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:36:59 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 16 13:36:59 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Consumed 1.470s CPU time.
Feb 16 13:36:59 compute-0 systemd-machined[155229]: Machine qemu-9-instance-0000000b terminated.
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [NOTICE]   (210925) : haproxy version is 2.8.14-c23fe91
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [NOTICE]   (210925) : path to executable is /usr/sbin/haproxy
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [WARNING]  (210925) : Exiting Master process...
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [WARNING]  (210925) : Exiting Master process...
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [ALERT]    (210925) : Current worker (210927) exited with code 143 (Terminated)
Feb 16 13:36:59 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210921]: [WARNING]  (210925) : All workers exited. Exiting... (0)
Feb 16 13:36:59 compute-0 systemd[1]: libpod-a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb.scope: Deactivated successfully.
Feb 16 13:36:59 compute-0 podman[211278]: 2026-02-16 13:36:59.165621708 +0000 UTC m=+0.050320066 container died a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 16 13:36:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb-userdata-shm.mount: Deactivated successfully.
Feb 16 13:36:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-aedd1a1ede9506bdddae3bf326748c7bd036b6de99942b5e003086620f9f6a63-merged.mount: Deactivated successfully.
Feb 16 13:36:59 compute-0 podman[211278]: 2026-02-16 13:36:59.215724017 +0000 UTC m=+0.100422375 container cleanup a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.215 185727 INFO nova.virt.libvirt.driver [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance destroyed successfully.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.216 185727 DEBUG nova.objects.instance [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:59 compute-0 systemd[1]: libpod-conmon-a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb.scope: Deactivated successfully.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.232 185727 DEBUG nova.virt.libvirt.vif [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:35:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:36:48Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.232 185727 DEBUG nova.network.os_vif_util [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.233 185727 DEBUG nova.network.os_vif_util [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.234 185727 DEBUG os_vif [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.235 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.236 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b908a4c-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.238 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.240 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.243 185727 INFO os_vif [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9')
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.243 185727 INFO nova.virt.libvirt.driver [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Deleting instance files /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0_del
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.244 185727 INFO nova.virt.libvirt.driver [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Deletion of /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0_del complete
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.317 185727 INFO nova.compute.manager [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.317 185727 DEBUG oslo.service.loopingcall [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.318 185727 DEBUG nova.compute.manager [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.318 185727 DEBUG nova.network.neutron [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:36:59 compute-0 podman[211325]: 2026-02-16 13:36:59.320398255 +0000 UTC m=+0.084483043 container remove a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.326 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[430fe437-1329-41ce-86b9-f70426c324ec]: (4, ('Mon Feb 16 01:36:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb)\na57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb\nMon Feb 16 01:36:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (a57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb)\na57a61246fea3613066b0575aa9747df0abd08348fcfa3344b0e1003de7a59fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.327 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7aff62-05de-4722-ae99-91d98896cddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.328 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.330 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.334 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.336 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.337 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d53fb7-aeab-45bc-b4b4-48a72b9711d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.353 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f6562025-4427-4731-ab0b-c08527a73e8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.354 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b3047341-abd3-43cd-8453-3c93e33f3809]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.367 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd9b6ee-83cb-4832-9c1f-c46598e54793]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505504, 'reachable_time': 43290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211339, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.370 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:36:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:36:59.371 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f01246-8021-4524-92bb-2a84fea77f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.416 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:59 compute-0 podman[195053]: time="2026-02-16T13:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:36:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:36:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.886 185727 DEBUG nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.886 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.886 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.886 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.887 185727 DEBUG oslo_concurrency.lockutils [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.888 185727 DEBUG nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.888 185727 WARNING nova.compute.manager [req-35d12fd6-c56e-4d46-9d28-92168f5453b2 req-14edfb96-8ab9-4fb3-be6b-61e141b2bd00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state deleting.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.889 185727 DEBUG nova.network.neutron [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.905 185727 INFO nova.compute.manager [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Took 0.59 seconds to deallocate network for instance.
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.942 185727 DEBUG nova.compute.manager [req-55e3e0ff-2dae-42d3-b815-86741a052624 req-9721ac64-d8dd-47a3-b29a-924fb150a538 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-deleted-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.944 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.944 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.949 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:59 compute-0 nova_compute[185723]: 2026-02-16 13:36:59.969 185727 INFO nova.scheduler.client.report [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance fea12b84-b444-4299-a1d9-2e974fbb93e0
Feb 16 13:37:00 compute-0 nova_compute[185723]: 2026-02-16 13:37:00.041 185727 DEBUG oslo_concurrency.lockutils [None req-41a0e94a-4ad7-4a85-a00e-443280f1e706 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:01 compute-0 openstack_network_exporter[197909]: ERROR   13:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:37:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:37:01 compute-0 openstack_network_exporter[197909]: ERROR   13:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:37:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:37:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:03.229 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:03.229 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:03.230 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:04 compute-0 podman[211340]: 2026-02-16 13:37:04.002811202 +0000 UTC m=+0.045059897 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:37:04 compute-0 nova_compute[185723]: 2026-02-16 13:37:04.240 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:04 compute-0 nova_compute[185723]: 2026-02-16 13:37:04.417 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:09 compute-0 nova_compute[185723]: 2026-02-16 13:37:09.242 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:09 compute-0 nova_compute[185723]: 2026-02-16 13:37:09.420 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:09 compute-0 nova_compute[185723]: 2026-02-16 13:37:09.828 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249014.826477, 6403f5ce-8933-4efa-b4a5-611cd66c8a29 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:37:09 compute-0 nova_compute[185723]: 2026-02-16 13:37:09.828 185727 INFO nova.compute.manager [-] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] VM Stopped (Lifecycle Event)
Feb 16 13:37:09 compute-0 nova_compute[185723]: 2026-02-16 13:37:09.858 185727 DEBUG nova.compute.manager [None req-5edef4e4-3fab-4bc5-94ef-b66e769d9c98 - - - - - -] [instance: 6403f5ce-8933-4efa-b4a5-611cd66c8a29] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:11 compute-0 sshd-session[211365]: Invalid user admin from 146.190.226.24 port 55754
Feb 16 13:37:11 compute-0 sshd-session[211365]: Connection closed by invalid user admin 146.190.226.24 port 55754 [preauth]
Feb 16 13:37:12 compute-0 nova_compute[185723]: 2026-02-16 13:37:12.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:13 compute-0 nova_compute[185723]: 2026-02-16 13:37:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:13 compute-0 nova_compute[185723]: 2026-02-16 13:37:13.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:37:13 compute-0 nova_compute[185723]: 2026-02-16 13:37:13.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:37:13 compute-0 nova_compute[185723]: 2026-02-16 13:37:13.449 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.214 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249019.212572, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.214 185727 INFO nova.compute.manager [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Stopped (Lifecycle Event)
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.245 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.421 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:14 compute-0 nova_compute[185723]: 2026-02-16 13:37:14.537 185727 DEBUG nova.compute.manager [None req-e6648e2d-8e8b-4afe-b5aa-78ca327457bc - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.465 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.466 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.466 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.635 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.636 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.22517395019531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.637 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.637 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.709 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.709 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.730 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.748 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.780 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:37:15 compute-0 nova_compute[185723]: 2026-02-16 13:37:15.781 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:17 compute-0 nova_compute[185723]: 2026-02-16 13:37:17.777 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:18 compute-0 nova_compute[185723]: 2026-02-16 13:37:18.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:18 compute-0 nova_compute[185723]: 2026-02-16 13:37:18.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:37:19 compute-0 nova_compute[185723]: 2026-02-16 13:37:19.247 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:19 compute-0 nova_compute[185723]: 2026-02-16 13:37:19.467 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:20 compute-0 podman[211369]: 2026-02-16 13:37:20.04223185 +0000 UTC m=+0.075927554 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:37:20 compute-0 podman[211368]: 2026-02-16 13:37:20.04223152 +0000 UTC m=+0.081997543 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1770267347, container_name=openstack_network_exporter, vendor=Red Hat, Inc., 
maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:37:20 compute-0 nova_compute[185723]: 2026-02-16 13:37:20.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:24 compute-0 podman[211406]: 2026-02-16 13:37:24.027417039 +0000 UTC m=+0.071061415 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:37:24 compute-0 nova_compute[185723]: 2026-02-16 13:37:24.249 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:24 compute-0 nova_compute[185723]: 2026-02-16 13:37:24.468 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:26 compute-0 sshd-session[211430]: Invalid user admin from 64.227.72.94 port 48676
Feb 16 13:37:26 compute-0 sshd-session[211430]: Connection closed by invalid user admin 64.227.72.94 port 48676 [preauth]
Feb 16 13:37:29 compute-0 nova_compute[185723]: 2026-02-16 13:37:29.251 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:29 compute-0 nova_compute[185723]: 2026-02-16 13:37:29.470 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:29 compute-0 podman[195053]: time="2026-02-16T13:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:37:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:37:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 13:37:29 compute-0 ovn_controller[96072]: 2026-02-16T13:37:29Z|00117|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 16 13:37:31 compute-0 openstack_network_exporter[197909]: ERROR   13:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:37:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:37:31 compute-0 openstack_network_exporter[197909]: ERROR   13:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:37:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:37:31 compute-0 sshd-session[211432]: Invalid user server from 146.190.22.227 port 59790
Feb 16 13:37:32 compute-0 sshd-session[211432]: Connection closed by invalid user server 146.190.22.227 port 59790 [preauth]
Feb 16 13:37:34 compute-0 nova_compute[185723]: 2026-02-16 13:37:34.297 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:34 compute-0 podman[211434]: 2026-02-16 13:37:34.395619248 +0000 UTC m=+0.068912032 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:37:34 compute-0 nova_compute[185723]: 2026-02-16 13:37:34.471 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-0 nova_compute[185723]: 2026-02-16 13:37:39.300 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-0 nova_compute[185723]: 2026-02-16 13:37:39.474 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:42 compute-0 sshd-session[211459]: Invalid user postgres from 188.166.42.159 port 53812
Feb 16 13:37:42 compute-0 sshd-session[211459]: Connection closed by invalid user postgres 188.166.42.159 port 53812 [preauth]
Feb 16 13:37:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:43.438 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:37:43 compute-0 nova_compute[185723]: 2026-02-16 13:37:43.438 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:43.439 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:37:44 compute-0 nova_compute[185723]: 2026-02-16 13:37:44.303 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:44 compute-0 nova_compute[185723]: 2026-02-16 13:37:44.475 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:37:46.441 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:49 compute-0 nova_compute[185723]: 2026-02-16 13:37:49.306 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:49 compute-0 nova_compute[185723]: 2026-02-16 13:37:49.477 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:51 compute-0 podman[211462]: 2026-02-16 13:37:51.01423243 +0000 UTC m=+0.049437354 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:37:51 compute-0 podman[211461]: 2026-02-16 13:37:51.031566275 +0000 UTC m=+0.071194038 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.750 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.751 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.766 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.889 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.890 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.902 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:37:51 compute-0 nova_compute[185723]: 2026-02-16 13:37:51.902 185727 INFO nova.compute.claims [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.031 185727 DEBUG nova.compute.provider_tree [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.048 185727 DEBUG nova.scheduler.client.report [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.070 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.071 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.125 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.125 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.145 185727 INFO nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.167 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.280 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.282 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.282 185727 INFO nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Creating image(s)
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.283 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.283 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.284 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.302 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.351 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.353 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.353 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.370 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.388 185727 DEBUG nova.policy [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.420 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.421 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.450 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.451 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.452 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.516 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.517 185727 DEBUG nova.virt.disk.api [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.518 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.569 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.570 185727 DEBUG nova.virt.disk.api [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.570 185727 DEBUG nova.objects.instance [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 81267e8d-93ab-405d-863c-176b83cabb76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.594 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.594 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Ensure instance console log exists: /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.595 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.595 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.595 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:52 compute-0 nova_compute[185723]: 2026-02-16 13:37:52.978 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Successfully created port: 844199d8-6751-444b-b1e3-c6bb692ad49f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:37:54 compute-0 nova_compute[185723]: 2026-02-16 13:37:54.307 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:54 compute-0 nova_compute[185723]: 2026-02-16 13:37:54.519 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:55 compute-0 podman[211511]: 2026-02-16 13:37:55.030523352 +0000 UTC m=+0.072721955 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, container_name=ovn_controller)
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.484 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Successfully updated port: 844199d8-6751-444b-b1e3-c6bb692ad49f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.502 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.502 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.502 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.574 185727 DEBUG nova.compute.manager [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-changed-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.574 185727 DEBUG nova.compute.manager [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Refreshing instance network info cache due to event network-changed-844199d8-6751-444b-b1e3-c6bb692ad49f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:37:56 compute-0 nova_compute[185723]: 2026-02-16 13:37:56.574 185727 DEBUG oslo_concurrency.lockutils [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:37:57 compute-0 nova_compute[185723]: 2026-02-16 13:37:57.330 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.311 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.372 185727 DEBUG nova.network.neutron [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updating instance_info_cache with network_info: [{"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.401 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.402 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Instance network_info: |[{"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.403 185727 DEBUG oslo_concurrency.lockutils [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.403 185727 DEBUG nova.network.neutron [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Refreshing network info cache for port 844199d8-6751-444b-b1e3-c6bb692ad49f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.408 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Start _get_guest_xml network_info=[{"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.416 185727 WARNING nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.424 185727 DEBUG nova.virt.libvirt.host [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.425 185727 DEBUG nova.virt.libvirt.host [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.435 185727 DEBUG nova.virt.libvirt.host [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.437 185727 DEBUG nova.virt.libvirt.host [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.439 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.439 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.440 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.440 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.441 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.441 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.442 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.442 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.443 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.443 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.444 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.444 185727 DEBUG nova.virt.hardware [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.451 185727 DEBUG nova.virt.libvirt.vif [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:37:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-392044604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-392044604',id=14,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-d8munj1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:37:52Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=81267e8d-93ab-405d-863c-176b83cabb76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.451 185727 DEBUG nova.network.os_vif_util [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.453 185727 DEBUG nova.network.os_vif_util [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.454 185727 DEBUG nova.objects.instance [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 81267e8d-93ab-405d-863c-176b83cabb76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.474 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <uuid>81267e8d-93ab-405d-863c-176b83cabb76</uuid>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <name>instance-0000000e</name>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-392044604</nova:name>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:37:59</nova:creationTime>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         <nova:port uuid="844199d8-6751-444b-b1e3-c6bb692ad49f">
Feb 16 13:37:59 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <system>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="serial">81267e8d-93ab-405d-863c-176b83cabb76</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="uuid">81267e8d-93ab-405d-863c-176b83cabb76</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </system>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <os>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </os>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <features>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </features>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.config"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:f6:00:5e"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <target dev="tap844199d8-67"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/console.log" append="off"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <video>
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </video>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:37:59 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:37:59 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:37:59 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:37:59 compute-0 nova_compute[185723]: </domain>
Feb 16 13:37:59 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.477 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Preparing to wait for external event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.477 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.478 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.478 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.480 185727 DEBUG nova.virt.libvirt.vif [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:37:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-392044604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-392044604',id=14,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-d8munj1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:37:52Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=81267e8d-93ab-405d-863c-176b83cabb76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.480 185727 DEBUG nova.network.os_vif_util [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.482 185727 DEBUG nova.network.os_vif_util [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.483 185727 DEBUG os_vif [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.484 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.484 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.485 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.489 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.490 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap844199d8-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.490 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap844199d8-67, col_values=(('external_ids', {'iface-id': '844199d8-6751-444b-b1e3-c6bb692ad49f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:00:5e', 'vm-uuid': '81267e8d-93ab-405d-863c-176b83cabb76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:59 compute-0 NetworkManager[56177]: <info>  [1771249079.4940] manager: (tap844199d8-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.496 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.502 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.503 185727 INFO os_vif [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67')
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.521 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.572 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.573 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.573 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:f6:00:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:37:59 compute-0 nova_compute[185723]: 2026-02-16 13:37:59.573 185727 INFO nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Using config drive
Feb 16 13:37:59 compute-0 podman[195053]: time="2026-02-16T13:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:37:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:37:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.351 185727 INFO nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Creating config drive at /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.config
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.357 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp59jnayqj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.485 185727 DEBUG oslo_concurrency.processutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp59jnayqj" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:00 compute-0 kernel: tap844199d8-67: entered promiscuous mode
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.5483] manager: (tap844199d8-67): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Feb 16 13:38:00 compute-0 ovn_controller[96072]: 2026-02-16T13:38:00Z|00118|binding|INFO|Claiming lport 844199d8-6751-444b-b1e3-c6bb692ad49f for this chassis.
Feb 16 13:38:00 compute-0 ovn_controller[96072]: 2026-02-16T13:38:00Z|00119|binding|INFO|844199d8-6751-444b-b1e3-c6bb692ad49f: Claiming fa:16:3e:f6:00:5e 10.100.0.9
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.551 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 ovn_controller[96072]: 2026-02-16T13:38:00Z|00120|binding|INFO|Setting lport 844199d8-6751-444b-b1e3-c6bb692ad49f ovn-installed in OVS
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.556 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.559 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.562 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 ovn_controller[96072]: 2026-02-16T13:38:00Z|00121|binding|INFO|Setting lport 844199d8-6751-444b-b1e3-c6bb692ad49f up in Southbound
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.565 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:00:5e 10.100.0.9'], port_security=['fa:16:3e:f6:00:5e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '81267e8d-93ab-405d-863c-176b83cabb76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=844199d8-6751-444b-b1e3-c6bb692ad49f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.566 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 844199d8-6751-444b-b1e3-c6bb692ad49f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.567 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.577 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[584c79d4-ad0f-4f1c-97b4-3fcb72a39aba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.579 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:38:00 compute-0 systemd-udevd[211559]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:38:00 compute-0 systemd-machined[155229]: New machine qemu-10-instance-0000000e.
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.581 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.581 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e43d2587-1fc8-4180-abd2-f07d53fcfb3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.582 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb21cf8-828f-4ca1-80cc-fc38749123ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.5908] device (tap844199d8-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.5924] device (tap844199d8-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.595 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[eeaa4386-0534-4ce9-9d96-e4af7a66994f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000e.
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.606 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba16ac1-d70e-416f-9926-9e8a20ecb7ce]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.629 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[971c693c-39c8-4949-9896-6dca2cb2cdf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 systemd-udevd[211562]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.6351] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.635 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac73690-ad9b-493d-bf4f-275ba7e79971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.668 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[7c17fe27-e1d6-4e2d-b010-633444924812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.671 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[1718a1de-23cd-4c5d-a976-2b56b79e4e14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.6897] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.694 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bc0ac8-d5d9-4fa0-b916-fc661c4287a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.708 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6d8e3b-13f0-4029-9729-71e3409f0193]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516408, 'reachable_time': 38827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211591, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.723 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f32f58-524e-43ce-b36e-95c73749d55f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516408, 'tstamp': 516408}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211592, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.737 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[74bf5166-a9e4-4ba9-9d01-893c0769fcf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516408, 'reachable_time': 38827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211593, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.768 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a23742f6-e21e-49d1-a5c8-5885a05ce058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.811 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f25fb285-b05f-4dd5-8c00-6cc2adf1c213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.813 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.813 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.814 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:00 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:38:00 compute-0 NetworkManager[56177]: <info>  [1771249080.8165] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.815 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.818 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.819 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 ovn_controller[96072]: 2026-02-16T13:38:00Z|00122|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.822 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.825 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.825 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb619f6-e3a1-4ad7-bc9e-d31cf9bb7057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.826 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:38:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:00.828 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.829 185727 DEBUG nova.compute.manager [req-6e6d16cc-9cf1-4539-9433-7bc67e523aa5 req-89805bae-28d6-4c84-b0df-045fb83cc179 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.830 185727 DEBUG oslo_concurrency.lockutils [req-6e6d16cc-9cf1-4539-9433-7bc67e523aa5 req-89805bae-28d6-4c84-b0df-045fb83cc179 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.830 185727 DEBUG oslo_concurrency.lockutils [req-6e6d16cc-9cf1-4539-9433-7bc67e523aa5 req-89805bae-28d6-4c84-b0df-045fb83cc179 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.830 185727 DEBUG oslo_concurrency.lockutils [req-6e6d16cc-9cf1-4539-9433-7bc67e523aa5 req-89805bae-28d6-4c84-b0df-045fb83cc179 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.830 185727 DEBUG nova.compute.manager [req-6e6d16cc-9cf1-4539-9433-7bc67e523aa5 req-89805bae-28d6-4c84-b0df-045fb83cc179 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Processing event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.982 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.983 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249080.9817524, 81267e8d-93ab-405d-863c-176b83cabb76 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.983 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] VM Started (Lifecycle Event)
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.986 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.989 185727 INFO nova.virt.libvirt.driver [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Instance spawned successfully.
Feb 16 13:38:00 compute-0 nova_compute[185723]: 2026-02-16 13:38:00.989 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.005 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.009 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.013 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.013 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.013 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.014 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.014 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.014 185727 DEBUG nova.virt.libvirt.driver [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.040 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.041 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249080.9819987, 81267e8d-93ab-405d-863c-176b83cabb76 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.041 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] VM Paused (Lifecycle Event)
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.065 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.069 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249080.9857624, 81267e8d-93ab-405d-863c-176b83cabb76 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.070 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] VM Resumed (Lifecycle Event)
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.075 185727 INFO nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Took 8.79 seconds to spawn the instance on the hypervisor.
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.075 185727 DEBUG nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.099 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.103 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.128 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.140 185727 INFO nova.compute.manager [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Took 9.29 seconds to build instance.
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.155 185727 DEBUG oslo_concurrency.lockutils [None req-2c297189-30f1-4e5b-95bf-dfb5a798f4ac e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.405s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:01 compute-0 podman[211632]: 2026-02-16 13:38:01.169133415 +0000 UTC m=+0.044916233 container create 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 16 13:38:01 compute-0 systemd[1]: Started libpod-conmon-0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588.scope.
Feb 16 13:38:01 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:38:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/546a8eadafdc42fd135a470bfcb33779600f325b4b0af04b3f1ad8bb791dfa70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:38:01 compute-0 podman[211632]: 2026-02-16 13:38:01.236692943 +0000 UTC m=+0.112475761 container init 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:38:01 compute-0 podman[211632]: 2026-02-16 13:38:01.143544148 +0000 UTC m=+0.019326976 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:38:01 compute-0 podman[211632]: 2026-02-16 13:38:01.242711681 +0000 UTC m=+0.118494479 container start 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 16 13:38:01 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [NOTICE]   (211651) : New worker (211653) forked
Feb 16 13:38:01 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [NOTICE]   (211651) : Loading success.
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.413 185727 DEBUG nova.network.neutron [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updated VIF entry in instance network info cache for port 844199d8-6751-444b-b1e3-c6bb692ad49f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.413 185727 DEBUG nova.network.neutron [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updating instance_info_cache with network_info: [{"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:01 compute-0 openstack_network_exporter[197909]: ERROR   13:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:38:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:38:01 compute-0 openstack_network_exporter[197909]: ERROR   13:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:38:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:38:01 compute-0 nova_compute[185723]: 2026-02-16 13:38:01.431 185727 DEBUG oslo_concurrency.lockutils [req-3fe8e777-af39-4e57-ada4-e0d18a30becf req-5bc5e6bc-ee63-4c72-acf5-595ca3ea057f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.055 185727 DEBUG nova.compute.manager [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.056 185727 DEBUG oslo_concurrency.lockutils [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.056 185727 DEBUG oslo_concurrency.lockutils [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.056 185727 DEBUG oslo_concurrency.lockutils [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.057 185727 DEBUG nova.compute.manager [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] No waiting events found dispatching network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:03 compute-0 nova_compute[185723]: 2026-02-16 13:38:03.057 185727 WARNING nova.compute.manager [req-5fef961b-525c-4fdd-b0d9-f4ba56d77401 req-2117ff9d-7ebb-4003-b7de-eb4489f6f4e0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received unexpected event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f for instance with vm_state active and task_state None.
Feb 16 13:38:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:03.230 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:03.231 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:03.231 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:04 compute-0 nova_compute[185723]: 2026-02-16 13:38:04.494 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:04 compute-0 nova_compute[185723]: 2026-02-16 13:38:04.522 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:05 compute-0 podman[211662]: 2026-02-16 13:38:05.015383616 +0000 UTC m=+0.050555482 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:38:05 compute-0 nova_compute[185723]: 2026-02-16 13:38:05.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:05 compute-0 nova_compute[185723]: 2026-02-16 13:38:05.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:38:05 compute-0 nova_compute[185723]: 2026-02-16 13:38:05.452 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:38:09 compute-0 nova_compute[185723]: 2026-02-16 13:38:09.497 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:09 compute-0 nova_compute[185723]: 2026-02-16 13:38:09.526 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:12 compute-0 sshd-session[211693]: Invalid user admin from 64.227.72.94 port 35908
Feb 16 13:38:12 compute-0 sshd-session[211693]: Connection closed by invalid user admin 64.227.72.94 port 35908 [preauth]
Feb 16 13:38:14 compute-0 nova_compute[185723]: 2026-02-16 13:38:14.452 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:14 compute-0 nova_compute[185723]: 2026-02-16 13:38:14.453 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:14 compute-0 nova_compute[185723]: 2026-02-16 13:38:14.500 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:14 compute-0 nova_compute[185723]: 2026-02-16 13:38:14.527 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:14 compute-0 ovn_controller[96072]: 2026-02-16T13:38:14Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:00:5e 10.100.0.9
Feb 16 13:38:14 compute-0 ovn_controller[96072]: 2026-02-16T13:38:14Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:00:5e 10.100.0.9
Feb 16 13:38:15 compute-0 nova_compute[185723]: 2026-02-16 13:38:15.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:15 compute-0 nova_compute[185723]: 2026-02-16 13:38:15.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:15 compute-0 nova_compute[185723]: 2026-02-16 13:38:15.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:38:15 compute-0 nova_compute[185723]: 2026-02-16 13:38:15.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:38:16 compute-0 nova_compute[185723]: 2026-02-16 13:38:16.371 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:38:16 compute-0 nova_compute[185723]: 2026-02-16 13:38:16.372 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:38:16 compute-0 nova_compute[185723]: 2026-02-16 13:38:16.372 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:38:16 compute-0 nova_compute[185723]: 2026-02-16 13:38:16.372 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 81267e8d-93ab-405d-863c-176b83cabb76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:17 compute-0 sshd-session[211702]: Invalid user admin from 146.190.226.24 port 45634
Feb 16 13:38:18 compute-0 sshd-session[211702]: Connection closed by invalid user admin 146.190.226.24 port 45634 [preauth]
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.796 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updating instance_info_cache with network_info: [{"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.822 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-81267e8d-93ab-405d-863c-176b83cabb76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.822 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.823 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.823 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.823 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.824 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.824 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.849 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.849 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.850 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.850 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:38:18 compute-0 nova_compute[185723]: 2026-02-16 13:38:18.962 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.021 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.022 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.087 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.216 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.218 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5678MB free_disk=73.19596481323242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.218 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.218 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.377 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 81267e8d-93ab-405d-863c-176b83cabb76 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.378 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.378 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.529 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.531 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.531 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.532 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.534 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.538 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.539 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.551 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.576 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:38:19 compute-0 nova_compute[185723]: 2026-02-16 13:38:19.576 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:21 compute-0 nova_compute[185723]: 2026-02-16 13:38:21.186 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:22 compute-0 podman[211713]: 2026-02-16 13:38:22.016992772 +0000 UTC m=+0.050751047 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:38:22 compute-0 podman[211712]: 2026-02-16 13:38:22.029454757 +0000 UTC m=+0.062711649 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:38:24 compute-0 nova_compute[185723]: 2026-02-16 13:38:24.539 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:26 compute-0 podman[211750]: 2026-02-16 13:38:26.041464394 +0000 UTC m=+0.080909886 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:38:26 compute-0 nova_compute[185723]: 2026-02-16 13:38:26.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:26 compute-0 nova_compute[185723]: 2026-02-16 13:38:26.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:38:29 compute-0 nova_compute[185723]: 2026-02-16 13:38:29.540 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:29 compute-0 nova_compute[185723]: 2026-02-16 13:38:29.617 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating tmpfile /var/lib/nova/instances/tmp4juvj7yy to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:38:29 compute-0 nova_compute[185723]: 2026-02-16 13:38:29.618 185727 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:38:29 compute-0 podman[195053]: time="2026-02-16T13:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:38:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:38:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 13:38:30 compute-0 nova_compute[185723]: 2026-02-16 13:38:30.399 185727 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:38:30 compute-0 nova_compute[185723]: 2026-02-16 13:38:30.427 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:38:30 compute-0 nova_compute[185723]: 2026-02-16 13:38:30.428 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:38:30 compute-0 nova_compute[185723]: 2026-02-16 13:38:30.428 185727 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:38:30 compute-0 ovn_controller[96072]: 2026-02-16T13:38:30Z|00123|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.408 185727 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:31 compute-0 openstack_network_exporter[197909]: ERROR   13:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:38:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:38:31 compute-0 openstack_network_exporter[197909]: ERROR   13:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:38:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.425 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.428 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.428 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating instance directory: /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.429 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating disk.info with the contents: {'/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk': 'qcow2', '/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.429 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.430 185727 DEBUG nova.objects.instance [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.459 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.508 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.509 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.510 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.520 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.569 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.570 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.600 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.602 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.603 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.670 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.671 185727 DEBUG nova.virt.disk.api [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.671 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.721 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.722 185727 DEBUG nova.virt.disk.api [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.722 185727 DEBUG nova.objects.instance [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.737 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.757 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.758 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config to /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:38:31 compute-0 nova_compute[185723]: 2026-02-16 13:38:31.759 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.225 185727 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.226 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.227 185727 DEBUG nova.virt.libvirt.vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:37:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:37:42Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.228 185727 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.228 185727 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.229 185727 DEBUG os_vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.229 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.230 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.230 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.235 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.236 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6eb3ffb6-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.237 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6eb3ffb6-7a, col_values=(('external_ids', {'iface-id': '6eb3ffb6-7a82-44c5-98d8-1fa609426d92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:9b:d1', 'vm-uuid': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.239 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:32 compute-0 NetworkManager[56177]: <info>  [1771249112.2404] manager: (tap6eb3ffb6-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.241 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.248 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.249 185727 INFO os_vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a')
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.250 185727 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:38:32 compute-0 nova_compute[185723]: 2026-02-16 13:38:32.250 185727 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:38:34 compute-0 nova_compute[185723]: 2026-02-16 13:38:34.541 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:35 compute-0 sshd-session[211799]: Invalid user postgres from 188.166.42.159 port 52078
Feb 16 13:38:36 compute-0 podman[211801]: 2026-02-16 13:38:36.050559683 +0000 UTC m=+0.083294515 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:38:36 compute-0 sshd-session[211799]: Connection closed by invalid user postgres 188.166.42.159 port 52078 [preauth]
Feb 16 13:38:37 compute-0 nova_compute[185723]: 2026-02-16 13:38:37.239 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:37 compute-0 nova_compute[185723]: 2026-02-16 13:38:37.446 185727 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:38:37 compute-0 nova_compute[185723]: 2026-02-16 13:38:37.448 185727 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:38:37 compute-0 kernel: tap6eb3ffb6-7a: entered promiscuous mode
Feb 16 13:38:37 compute-0 NetworkManager[56177]: <info>  [1771249117.6864] manager: (tap6eb3ffb6-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Feb 16 13:38:37 compute-0 nova_compute[185723]: 2026-02-16 13:38:37.731 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:37 compute-0 ovn_controller[96072]: 2026-02-16T13:38:37Z|00124|binding|INFO|Claiming lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for this additional chassis.
Feb 16 13:38:37 compute-0 ovn_controller[96072]: 2026-02-16T13:38:37Z|00125|binding|INFO|6eb3ffb6-7a82-44c5-98d8-1fa609426d92: Claiming fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:38:37 compute-0 ovn_controller[96072]: 2026-02-16T13:38:37Z|00126|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 ovn-installed in OVS
Feb 16 13:38:37 compute-0 nova_compute[185723]: 2026-02-16 13:38:37.737 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:37 compute-0 systemd-udevd[211841]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:38:37 compute-0 systemd-machined[155229]: New machine qemu-11-instance-0000000d.
Feb 16 13:38:37 compute-0 NetworkManager[56177]: <info>  [1771249117.7653] device (tap6eb3ffb6-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:38:37 compute-0 NetworkManager[56177]: <info>  [1771249117.7662] device (tap6eb3ffb6-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:38:37 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000d.
Feb 16 13:38:38 compute-0 nova_compute[185723]: 2026-02-16 13:38:38.787 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249118.7864118, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:38 compute-0 nova_compute[185723]: 2026-02-16 13:38:38.787 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Started (Lifecycle Event)
Feb 16 13:38:38 compute-0 nova_compute[185723]: 2026-02-16 13:38:38.811 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:39 compute-0 nova_compute[185723]: 2026-02-16 13:38:39.543 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.242 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.357 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249122.3571615, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.358 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Resumed (Lifecycle Event)
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.384 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.389 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.426 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:38:42 compute-0 nova_compute[185723]: 2026-02-16 13:38:42.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:43 compute-0 nova_compute[185723]: 2026-02-16 13:38:43.598 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:43.598 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:43.600 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:38:44 compute-0 nova_compute[185723]: 2026-02-16 13:38:44.546 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.602 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:44 compute-0 ovn_controller[96072]: 2026-02-16T13:38:44Z|00127|binding|INFO|Claiming lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for this chassis.
Feb 16 13:38:44 compute-0 ovn_controller[96072]: 2026-02-16T13:38:44Z|00128|binding|INFO|6eb3ffb6-7a82-44c5-98d8-1fa609426d92: Claiming fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:38:44 compute-0 ovn_controller[96072]: 2026-02-16T13:38:44Z|00129|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 up in Southbound
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.793 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.795 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.796 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.814 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd42ce1-1ed7-4223-9f14-74adf5383674]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.846 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0a913450-330c-41e7-a102-9adadff94dd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.849 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[9f84811a-1ae7-438b-aed5-2e5dbc82e914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.871 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[36bc94f2-588e-4dae-9420-c31a97e98c3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.887 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a67438a9-2cfd-4d8f-ac74-6bce8ecb7289]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516408, 'reachable_time': 38827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211873, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.901 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7693448a-4d28-4cb5-9227-790e988eaca0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516418, 'tstamp': 516418}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211874, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516420, 'tstamp': 516420}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211874, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.903 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:44 compute-0 nova_compute[185723]: 2026-02-16 13:38:44.905 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.907 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.907 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.907 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:44.908 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:45 compute-0 nova_compute[185723]: 2026-02-16 13:38:45.023 185727 INFO nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Post operation of migration started
Feb 16 13:38:45 compute-0 nova_compute[185723]: 2026-02-16 13:38:45.632 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:38:45 compute-0 nova_compute[185723]: 2026-02-16 13:38:45.633 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:38:45 compute-0 nova_compute[185723]: 2026-02-16 13:38:45.633 185727 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:38:47 compute-0 nova_compute[185723]: 2026-02-16 13:38:47.245 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.174 185727 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.208 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.229 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.230 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.230 185727 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:48 compute-0 nova_compute[185723]: 2026-02-16 13:38:48.236 185727 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:38:48 compute-0 virtqemud[184843]: Domain id=11 name='instance-0000000d' uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75 is tainted: custom-monitor
Feb 16 13:38:49 compute-0 nova_compute[185723]: 2026-02-16 13:38:49.244 185727 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:38:49 compute-0 nova_compute[185723]: 2026-02-16 13:38:49.597 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:50 compute-0 nova_compute[185723]: 2026-02-16 13:38:50.251 185727 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:38:50 compute-0 nova_compute[185723]: 2026-02-16 13:38:50.257 185727 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:50 compute-0 nova_compute[185723]: 2026-02-16 13:38:50.288 185727 DEBUG nova.objects.instance [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:38:52 compute-0 nova_compute[185723]: 2026-02-16 13:38:52.248 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:53 compute-0 podman[211876]: 2026-02-16 13:38:53.020165903 +0000 UTC m=+0.052902959 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:38:53 compute-0 podman[211875]: 2026-02-16 13:38:53.023110406 +0000 UTC m=+0.059047160 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1770267347, maintainer=Red Hat, Inc., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:38:54 compute-0 nova_compute[185723]: 2026-02-16 13:38:54.599 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.874 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.874 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.875 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.875 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.875 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.877 185727 INFO nova.compute.manager [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Terminating instance
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.878 185727 DEBUG nova.compute.manager [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:38:55 compute-0 kernel: tap844199d8-67 (unregistering): left promiscuous mode
Feb 16 13:38:55 compute-0 NetworkManager[56177]: <info>  [1771249135.9162] device (tap844199d8-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.926 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:55 compute-0 ovn_controller[96072]: 2026-02-16T13:38:55Z|00130|binding|INFO|Releasing lport 844199d8-6751-444b-b1e3-c6bb692ad49f from this chassis (sb_readonly=0)
Feb 16 13:38:55 compute-0 ovn_controller[96072]: 2026-02-16T13:38:55Z|00131|binding|INFO|Setting lport 844199d8-6751-444b-b1e3-c6bb692ad49f down in Southbound
Feb 16 13:38:55 compute-0 ovn_controller[96072]: 2026-02-16T13:38:55Z|00132|binding|INFO|Removing iface tap844199d8-67 ovn-installed in OVS
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.932 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:55 compute-0 nova_compute[185723]: 2026-02-16 13:38:55.933 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.937 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:00:5e 10.100.0.9'], port_security=['fa:16:3e:f6:00:5e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '81267e8d-93ab-405d-863c-176b83cabb76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=844199d8-6751-444b-b1e3-c6bb692ad49f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.938 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 844199d8-6751-444b-b1e3-c6bb692ad49f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.940 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.956 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ac69d450-99e8-4b38-8a5d-65d91ff386b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:55 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Feb 16 13:38:55 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000e.scope: Consumed 13.373s CPU time.
Feb 16 13:38:55 compute-0 systemd-machined[155229]: Machine qemu-10-instance-0000000e terminated.
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.984 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[e17315e8-6db6-429b-9c67-44dc9043c44f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:55.988 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[1f13fe2a-94f7-4850-8d8e-b84351e1fdf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.011 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[9547db31-d592-41cb-8714-e18d12fedf23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.025 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee8e8c8-9460-49b9-9f3d-1f68f854fef7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516408, 'reachable_time': 38827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211932, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.038 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e95db7e9-c677-4ae7-9cbf-1aedc2b377f9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516418, 'tstamp': 516418}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211933, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516420, 'tstamp': 516420}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211933, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.040 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.042 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.046 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.046 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.047 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.047 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:38:56.048 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.100 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.103 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.143 185727 INFO nova.virt.libvirt.driver [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Instance destroyed successfully.
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.144 185727 DEBUG nova.objects.instance [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 81267e8d-93ab-405d-863c-176b83cabb76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.166 185727 DEBUG nova.virt.libvirt.vif [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:37:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-392044604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-392044604',id=14,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:38:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-d8munj1v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:38:01Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=81267e8d-93ab-405d-863c-176b83cabb76,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.166 185727 DEBUG nova.network.os_vif_util [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "844199d8-6751-444b-b1e3-c6bb692ad49f", "address": "fa:16:3e:f6:00:5e", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap844199d8-67", "ovs_interfaceid": "844199d8-6751-444b-b1e3-c6bb692ad49f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.167 185727 DEBUG nova.network.os_vif_util [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.167 185727 DEBUG os_vif [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.169 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.169 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap844199d8-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.176 185727 INFO os_vif [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:00:5e,bridge_name='br-int',has_traffic_filtering=True,id=844199d8-6751-444b-b1e3-c6bb692ad49f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap844199d8-67')
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.177 185727 INFO nova.virt.libvirt.driver [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Deleting instance files /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76_del
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.178 185727 INFO nova.virt.libvirt.driver [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Deletion of /var/lib/nova/instances/81267e8d-93ab-405d-863c-176b83cabb76_del complete
Feb 16 13:38:56 compute-0 podman[211950]: 2026-02-16 13:38:56.237942754 +0000 UTC m=+0.083849378 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.244 185727 INFO nova.compute.manager [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.244 185727 DEBUG oslo.service.loopingcall [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.245 185727 DEBUG nova.compute.manager [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.245 185727 DEBUG nova.network.neutron [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.755 185727 DEBUG nova.compute.manager [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-unplugged-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.755 185727 DEBUG oslo_concurrency.lockutils [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.756 185727 DEBUG oslo_concurrency.lockutils [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.756 185727 DEBUG oslo_concurrency.lockutils [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.756 185727 DEBUG nova.compute.manager [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] No waiting events found dispatching network-vif-unplugged-844199d8-6751-444b-b1e3-c6bb692ad49f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:56 compute-0 nova_compute[185723]: 2026-02-16 13:38:56.757 185727 DEBUG nova.compute.manager [req-7dddea4c-d8b0-4742-bf8f-1d3e20f4f239 req-c2591d9e-458e-491f-81a8-3e22581e241d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-unplugged-844199d8-6751-444b-b1e3-c6bb692ad49f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:38:57 compute-0 nova_compute[185723]: 2026-02-16 13:38:57.370 185727 DEBUG nova.network.neutron [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:57 compute-0 nova_compute[185723]: 2026-02-16 13:38:57.400 185727 INFO nova.compute.manager [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Took 1.15 seconds to deallocate network for instance.
Feb 16 13:38:57 compute-0 nova_compute[185723]: 2026-02-16 13:38:57.733 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:57 compute-0 nova_compute[185723]: 2026-02-16 13:38:57.734 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:58 compute-0 nova_compute[185723]: 2026-02-16 13:38:58.359 185727 DEBUG nova.compute.provider_tree [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:38:58 compute-0 nova_compute[185723]: 2026-02-16 13:38:58.378 185727 DEBUG nova.scheduler.client.report [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:38:58 compute-0 nova_compute[185723]: 2026-02-16 13:38:58.410 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:58 compute-0 nova_compute[185723]: 2026-02-16 13:38:58.437 185727 INFO nova.scheduler.client.report [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 81267e8d-93ab-405d-863c-176b83cabb76
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.180 185727 DEBUG oslo_concurrency.lockutils [None req-67a03f7a-3a84-4495-af3a-96d150f8de16 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.185 185727 DEBUG nova.compute.manager [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.186 185727 DEBUG oslo_concurrency.lockutils [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "81267e8d-93ab-405d-863c-176b83cabb76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.186 185727 DEBUG oslo_concurrency.lockutils [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.187 185727 DEBUG oslo_concurrency.lockutils [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "81267e8d-93ab-405d-863c-176b83cabb76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.187 185727 DEBUG nova.compute.manager [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] No waiting events found dispatching network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.187 185727 WARNING nova.compute.manager [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received unexpected event network-vif-plugged-844199d8-6751-444b-b1e3-c6bb692ad49f for instance with vm_state deleted and task_state None.
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.188 185727 DEBUG nova.compute.manager [req-a6b66370-bb2a-4be4-b149-947b083df944 req-363669a9-f5c7-4913-ba3b-4617bd677d91 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Received event network-vif-deleted-844199d8-6751-444b-b1e3-c6bb692ad49f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:59 compute-0 nova_compute[185723]: 2026-02-16 13:38:59.601 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:59 compute-0 podman[195053]: time="2026-02-16T13:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:38:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:38:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.554 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.555 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.555 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.556 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.556 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.557 185727 INFO nova.compute.manager [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Terminating instance
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.558 185727 DEBUG nova.compute.manager [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:39:00 compute-0 kernel: tap6eb3ffb6-7a (unregistering): left promiscuous mode
Feb 16 13:39:00 compute-0 NetworkManager[56177]: <info>  [1771249140.5958] device (tap6eb3ffb6-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:39:00 compute-0 ovn_controller[96072]: 2026-02-16T13:39:00Z|00133|binding|INFO|Releasing lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 from this chassis (sb_readonly=0)
Feb 16 13:39:00 compute-0 ovn_controller[96072]: 2026-02-16T13:39:00Z|00134|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 down in Southbound
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.655 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 ovn_controller[96072]: 2026-02-16T13:39:00Z|00135|binding|INFO|Removing iface tap6eb3ffb6-7a ovn-installed in OVS
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.658 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.660 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.665 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.667 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.668 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.669 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3f67f760-c631-4418-9971-b60b6726a27b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.670 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:39:00 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 16 13:39:00 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000d.scope: Consumed 2.425s CPU time.
Feb 16 13:39:00 compute-0 systemd-machined[155229]: Machine qemu-11-instance-0000000d terminated.
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.777 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.780 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 sshd-session[211981]: Invalid user admin from 64.227.72.94 port 38004
Feb 16 13:39:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [NOTICE]   (211651) : haproxy version is 2.8.14-c23fe91
Feb 16 13:39:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [NOTICE]   (211651) : path to executable is /usr/sbin/haproxy
Feb 16 13:39:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [WARNING]  (211651) : Exiting Master process...
Feb 16 13:39:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [ALERT]    (211651) : Current worker (211653) exited with code 143 (Terminated)
Feb 16 13:39:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211647]: [WARNING]  (211651) : All workers exited. Exiting... (0)
Feb 16 13:39:00 compute-0 systemd[1]: libpod-0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588.scope: Deactivated successfully.
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.813 185727 INFO nova.virt.libvirt.driver [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance destroyed successfully.
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.814 185727 DEBUG nova.objects.instance [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:39:00 compute-0 podman[212007]: 2026-02-16 13:39:00.818435738 +0000 UTC m=+0.065553759 container died 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.839 185727 DEBUG nova.virt.libvirt.vif [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:37:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:38:50Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.839 185727 DEBUG nova.network.os_vif_util [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.841 185727 DEBUG nova.network.os_vif_util [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.841 185727 DEBUG os_vif [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.844 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.844 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6eb3ffb6-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.846 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.848 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.850 185727 INFO os_vif [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a')
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.851 185727 INFO nova.virt.libvirt.driver [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Deleting instance files /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75_del
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.852 185727 INFO nova.virt.libvirt.driver [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Deletion of /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75_del complete
Feb 16 13:39:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588-userdata-shm.mount: Deactivated successfully.
Feb 16 13:39:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-546a8eadafdc42fd135a470bfcb33779600f325b4b0af04b3f1ad8bb791dfa70-merged.mount: Deactivated successfully.
Feb 16 13:39:00 compute-0 sshd-session[211981]: Connection closed by invalid user admin 64.227.72.94 port 38004 [preauth]
Feb 16 13:39:00 compute-0 podman[212007]: 2026-02-16 13:39:00.869780838 +0000 UTC m=+0.116898859 container cleanup 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:39:00 compute-0 systemd[1]: libpod-conmon-0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588.scope: Deactivated successfully.
Feb 16 13:39:00 compute-0 podman[212054]: 2026-02-16 13:39:00.921072576 +0000 UTC m=+0.033400710 container remove 0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.925 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[53e51f53-9803-4341-bb96-b6eeb8ed2a2b]: (4, ('Mon Feb 16 01:39:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588)\n0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588\nMon Feb 16 01:39:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588)\n0421dc033ff09d5637f4e4a4083dfc2b66d8d673943b17bb7638d65f3c7fe588\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.926 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[54e7c8a7-cd2a-4ae8-874a-7b5754776923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.927 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.929 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.934 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.936 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[745e2944-592c-48f9-be76-7909d0d3d48d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.945 185727 INFO nova.compute.manager [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.946 185727 DEBUG oslo.service.loopingcall [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.946 185727 DEBUG nova.compute.manager [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:39:00 compute-0 nova_compute[185723]: 2026-02-16 13:39:00.946 185727 DEBUG nova.network.neutron [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.956 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[87e36156-2866-4400-82ef-c86fd4659a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.958 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[52cab9e5-cd09-49af-bf15-cc3eaa5ffad6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.967 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a84902d8-e97d-4493-8a63-1c9badf1aa0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516401, 'reachable_time': 41983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212070, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.969 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:39:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:00.970 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[8121a187-b9bb-4a7d-8784-e1bc2050f3c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.088 185727 DEBUG nova.compute.manager [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.088 185727 DEBUG oslo_concurrency.lockutils [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.088 185727 DEBUG oslo_concurrency.lockutils [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.089 185727 DEBUG oslo_concurrency.lockutils [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.089 185727 DEBUG nova.compute.manager [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.089 185727 DEBUG nova.compute.manager [req-ce9c6bc2-c7e7-440a-8430-46566f8e65ec req-2f9f43a3-8c88-4f0a-935e-e14001be3253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:39:01 compute-0 openstack_network_exporter[197909]: ERROR   13:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:39:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:39:01 compute-0 openstack_network_exporter[197909]: ERROR   13:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:39:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:39:01 compute-0 nova_compute[185723]: 2026-02-16 13:39:01.905 185727 DEBUG nova.network.neutron [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.230 185727 INFO nova.compute.manager [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Took 1.28 seconds to deallocate network for instance.
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.285 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.286 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.293 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.362 185727 INFO nova.scheduler.client.report [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 3c7e1337-03a5-4860-9bdf-2ff0df92ca75
Feb 16 13:39:02 compute-0 nova_compute[185723]: 2026-02-16 13:39:02.454 185727 DEBUG oslo_concurrency.lockutils [None req-093a78fb-7940-41b4-8165-3aa81094b130 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:03.231 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:03.232 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:03.233 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.261 185727 DEBUG nova.compute.manager [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.261 185727 DEBUG oslo_concurrency.lockutils [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.261 185727 DEBUG oslo_concurrency.lockutils [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.262 185727 DEBUG oslo_concurrency.lockutils [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.262 185727 DEBUG nova.compute.manager [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.262 185727 WARNING nova.compute.manager [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state deleted and task_state None.
Feb 16 13:39:03 compute-0 nova_compute[185723]: 2026-02-16 13:39:03.262 185727 DEBUG nova.compute.manager [req-35d23d10-46e2-4794-9180-9392049f7d20 req-3c233650-d3db-4433-af2b-03c0a650e2fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-deleted-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:04 compute-0 nova_compute[185723]: 2026-02-16 13:39:04.604 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:05 compute-0 nova_compute[185723]: 2026-02-16 13:39:05.847 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:07 compute-0 podman[212071]: 2026-02-16 13:39:07.021231958 +0000 UTC m=+0.055981325 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:39:09 compute-0 nova_compute[185723]: 2026-02-16 13:39:09.854 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:10 compute-0 nova_compute[185723]: 2026-02-16 13:39:10.850 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:11 compute-0 nova_compute[185723]: 2026-02-16 13:39:11.141 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249136.1404314, 81267e8d-93ab-405d-863c-176b83cabb76 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:11 compute-0 nova_compute[185723]: 2026-02-16 13:39:11.142 185727 INFO nova.compute.manager [-] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] VM Stopped (Lifecycle Event)
Feb 16 13:39:11 compute-0 nova_compute[185723]: 2026-02-16 13:39:11.178 185727 DEBUG nova.compute.manager [None req-bcfce839-30f0-4dd7-86e7-6b2836eac184 - - - - - -] [instance: 81267e8d-93ab-405d-863c-176b83cabb76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:14 compute-0 nova_compute[185723]: 2026-02-16 13:39:14.856 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:15 compute-0 nova_compute[185723]: 2026-02-16 13:39:15.461 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:15 compute-0 nova_compute[185723]: 2026-02-16 13:39:15.813 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249140.8099966, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:15 compute-0 nova_compute[185723]: 2026-02-16 13:39:15.814 185727 INFO nova.compute.manager [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Stopped (Lifecycle Event)
Feb 16 13:39:15 compute-0 nova_compute[185723]: 2026-02-16 13:39:15.846 185727 DEBUG nova.compute.manager [None req-69391a56-f930-4284-9eb5-2b6c2ddea1f3 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:15 compute-0 nova_compute[185723]: 2026-02-16 13:39:15.854 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.432 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.432 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.471 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.471 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.471 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.472 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.521 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.522 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.522 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.523 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.712 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.713 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.22515106201172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.713 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.713 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.810 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.811 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.843 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.877 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.915 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:39:16 compute-0 nova_compute[185723]: 2026-02-16 13:39:16.916 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:18 compute-0 nova_compute[185723]: 2026-02-16 13:39:18.877 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:18 compute-0 nova_compute[185723]: 2026-02-16 13:39:18.902 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:19 compute-0 nova_compute[185723]: 2026-02-16 13:39:19.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:19 compute-0 nova_compute[185723]: 2026-02-16 13:39:19.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:39:19 compute-0 nova_compute[185723]: 2026-02-16 13:39:19.858 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:20 compute-0 nova_compute[185723]: 2026-02-16 13:39:20.858 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:21 compute-0 sshd-session[212096]: Invalid user hadoop from 146.190.22.227 port 53832
Feb 16 13:39:21 compute-0 nova_compute[185723]: 2026-02-16 13:39:21.435 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:22 compute-0 sshd-session[212096]: Connection closed by invalid user hadoop 146.190.22.227 port 53832 [preauth]
Feb 16 13:39:22 compute-0 sshd-session[212099]: Invalid user admin from 146.190.226.24 port 38718
Feb 16 13:39:22 compute-0 sshd-session[212099]: Connection closed by invalid user admin 146.190.226.24 port 38718 [preauth]
Feb 16 13:39:24 compute-0 podman[212102]: 2026-02-16 13:39:24.011088744 +0000 UTC m=+0.045280723 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:39:24 compute-0 podman[212101]: 2026-02-16 13:39:24.015057631 +0000 UTC m=+0.051456854 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, 
vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.tags=minimal rhel9)
Feb 16 13:39:24 compute-0 nova_compute[185723]: 2026-02-16 13:39:24.859 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:25 compute-0 nova_compute[185723]: 2026-02-16 13:39:25.964 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:27 compute-0 podman[212140]: 2026-02-16 13:39:27.088368087 +0000 UTC m=+0.132344468 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 16 13:39:27 compute-0 sshd-session[212166]: Invalid user postgres from 188.166.42.159 port 50154
Feb 16 13:39:27 compute-0 sshd-session[212166]: Connection closed by invalid user postgres 188.166.42.159 port 50154 [preauth]
Feb 16 13:39:29 compute-0 podman[195053]: time="2026-02-16T13:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:39:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:39:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:39:29 compute-0 nova_compute[185723]: 2026-02-16 13:39:29.861 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:30 compute-0 nova_compute[185723]: 2026-02-16 13:39:30.969 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:31 compute-0 openstack_network_exporter[197909]: ERROR   13:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:39:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:39:31 compute-0 openstack_network_exporter[197909]: ERROR   13:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:39:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:39:34 compute-0 nova_compute[185723]: 2026-02-16 13:39:34.864 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:36 compute-0 nova_compute[185723]: 2026-02-16 13:39:36.124 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:38 compute-0 podman[212168]: 2026-02-16 13:39:38.009179886 +0000 UTC m=+0.045438186 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:39:39 compute-0 nova_compute[185723]: 2026-02-16 13:39:39.865 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:41 compute-0 nova_compute[185723]: 2026-02-16 13:39:41.131 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:44 compute-0 nova_compute[185723]: 2026-02-16 13:39:44.867 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.012 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.013 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.048 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.132 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.144 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.144 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.152 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.153 185727 INFO nova.compute.claims [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.284 185727 DEBUG nova.compute.provider_tree [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.303 185727 DEBUG nova.scheduler.client.report [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.348 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.348 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.414 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.414 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.444 185727 INFO nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.467 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.571 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.572 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.573 185727 INFO nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating image(s)
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.573 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.574 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.574 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.588 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.635 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.636 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.637 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.652 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.696 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.697 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.729 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.730 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.731 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.779 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.780 185727 DEBUG nova.virt.disk.api [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.780 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.832 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.834 185727 DEBUG nova.virt.disk.api [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.835 185727 DEBUG nova.objects.instance [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.855 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.855 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Ensure instance console log exists: /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.856 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.856 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:46 compute-0 nova_compute[185723]: 2026-02-16 13:39:46.856 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:47 compute-0 nova_compute[185723]: 2026-02-16 13:39:47.416 185727 DEBUG nova.policy [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:39:47 compute-0 sshd-session[212209]: Invalid user admin from 64.227.72.94 port 51990
Feb 16 13:39:47 compute-0 sshd-session[212209]: Connection closed by invalid user admin 64.227.72.94 port 51990 [preauth]
Feb 16 13:39:48 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:48.995 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:39:48 compute-0 nova_compute[185723]: 2026-02-16 13:39:48.996 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:48 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:48.996 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:39:49 compute-0 nova_compute[185723]: 2026-02-16 13:39:49.125 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Successfully created port: 3dd3d50b-ad63-4bee-b823-c23750e7afc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:39:49 compute-0 nova_compute[185723]: 2026-02-16 13:39:49.870 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.134 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.516 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Successfully updated port: 3dd3d50b-ad63-4bee-b823-c23750e7afc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.548 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.548 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.548 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.725 185727 DEBUG nova.compute.manager [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-changed-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.726 185727 DEBUG nova.compute.manager [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Refreshing instance network info cache due to event network-changed-3dd3d50b-ad63-4bee-b823-c23750e7afc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.726 185727 DEBUG oslo_concurrency.lockutils [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:39:51 compute-0 nova_compute[185723]: 2026-02-16 13:39:51.844 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:39:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:52.000 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.770 185727 DEBUG nova.network.neutron [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.799 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.800 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance network_info: |[{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.800 185727 DEBUG oslo_concurrency.lockutils [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.801 185727 DEBUG nova.network.neutron [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Refreshing network info cache for port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.803 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Start _get_guest_xml network_info=[{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.807 185727 WARNING nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.811 185727 DEBUG nova.virt.libvirt.host [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.812 185727 DEBUG nova.virt.libvirt.host [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.816 185727 DEBUG nova.virt.libvirt.host [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.817 185727 DEBUG nova.virt.libvirt.host [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.818 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.818 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.819 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.819 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.819 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.819 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.820 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.820 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.820 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.821 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.821 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.821 185727 DEBUG nova.virt.hardware [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.824 185727 DEBUG nova.virt.libvirt.vif [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:39:46Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.824 185727 DEBUG nova.network.os_vif_util [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.825 185727 DEBUG nova.network.os_vif_util [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.826 185727 DEBUG nova.objects.instance [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.866 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <uuid>93d211b1-f197-4c96-a994-900df3bf28e4</uuid>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <name>instance-00000010</name>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-804254472</nova:name>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:39:53</nova:creationTime>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         <nova:port uuid="3dd3d50b-ad63-4bee-b823-c23750e7afc1">
Feb 16 13:39:53 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <system>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="serial">93d211b1-f197-4c96-a994-900df3bf28e4</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="uuid">93d211b1-f197-4c96-a994-900df3bf28e4</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </system>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <os>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </os>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <features>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </features>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:dd:e6:11"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <target dev="tap3dd3d50b-ad"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/console.log" append="off"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <video>
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </video>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:39:53 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:39:53 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:39:53 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:39:53 compute-0 nova_compute[185723]: </domain>
Feb 16 13:39:53 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.867 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Preparing to wait for external event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.867 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.868 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.868 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.868 185727 DEBUG nova.virt.libvirt.vif [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:39:46Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.869 185727 DEBUG nova.network.os_vif_util [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.869 185727 DEBUG nova.network.os_vif_util [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.870 185727 DEBUG os_vif [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.870 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.870 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.871 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.873 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3dd3d50b-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.873 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3dd3d50b-ad, col_values=(('external_ids', {'iface-id': '3dd3d50b-ad63-4bee-b823-c23750e7afc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:e6:11', 'vm-uuid': '93d211b1-f197-4c96-a994-900df3bf28e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.875 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:53 compute-0 NetworkManager[56177]: <info>  [1771249193.8761] manager: (tap3dd3d50b-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.877 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.880 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.881 185727 INFO os_vif [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad')
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.939 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.940 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.940 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:dd:e6:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:39:53 compute-0 nova_compute[185723]: 2026-02-16 13:39:53.940 185727 INFO nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Using config drive
Feb 16 13:39:54 compute-0 nova_compute[185723]: 2026-02-16 13:39:54.871 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:54 compute-0 nova_compute[185723]: 2026-02-16 13:39:54.964 185727 INFO nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating config drive at /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config
Feb 16 13:39:54 compute-0 nova_compute[185723]: 2026-02-16 13:39:54.968 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvcfufbf1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:55 compute-0 podman[212213]: 2026-02-16 13:39:55.017065556 +0000 UTC m=+0.053145905 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1770267347, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, version=9.7)
Feb 16 13:39:55 compute-0 podman[212214]: 2026-02-16 13:39:55.039546858 +0000 UTC m=+0.072522261 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.089 185727 DEBUG oslo_concurrency.processutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpvcfufbf1" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:55 compute-0 kernel: tap3dd3d50b-ad: entered promiscuous mode
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.1306] manager: (tap3dd3d50b-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Feb 16 13:39:55 compute-0 ovn_controller[96072]: 2026-02-16T13:39:55Z|00136|binding|INFO|Claiming lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 for this chassis.
Feb 16 13:39:55 compute-0 ovn_controller[96072]: 2026-02-16T13:39:55Z|00137|binding|INFO|3dd3d50b-ad63-4bee-b823-c23750e7afc1: Claiming fa:16:3e:dd:e6:11 10.100.0.8
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.131 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 ovn_controller[96072]: 2026-02-16T13:39:55Z|00138|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 ovn-installed in OVS
Feb 16 13:39:55 compute-0 ovn_controller[96072]: 2026-02-16T13:39:55Z|00139|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 up in Southbound
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.138 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.138 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:e6:11 10.100.0.8'], port_security=['fa:16:3e:dd:e6:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93d211b1-f197-4c96-a994-900df3bf28e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=3dd3d50b-ad63-4bee-b823-c23750e7afc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.139 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.140 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.140 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.141 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.151 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cb49903b-bd38-45f4-b573-ad95740158ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.151 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:39:55 compute-0 systemd-udevd[212268]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.156 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.156 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[45c52d9d-97a4-4edc-9e11-ac6244a919db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.157 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[491fc139-ea96-4d70-87ff-1cd8bd9437fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 systemd-machined[155229]: New machine qemu-12-instance-00000010.
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.1678] device (tap3dd3d50b-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.1684] device (tap3dd3d50b-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.167 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb9590f-b05e-4ac4-860f-dbf805b21e83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000010.
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.186 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc59528-cb6a-4cec-aa49-5b3276b3c8b3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.208 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd91fde-5f9c-4e8e-a0b0-edbc4916e319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.2141] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.213 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fffd44df-1e95-4329-932d-61029af161a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 systemd-udevd[212272]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.240 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fb007f-4181-4867-b1fd-c4b568a758c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.244 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[781a49da-746f-4b91-a4ce-349c6aba0be6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.2622] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.267 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[646cc8ca-5e5d-4bd6-ba33-f2e9e1164256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.282 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf4659d-339a-40a0-8054-1fffd5dc700a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527865, 'reachable_time': 43811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212301, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.294 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0f75347d-db0a-4108-a2da-43b366d56fe3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527865, 'tstamp': 527865}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212302, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.308 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[279fd7e9-52d3-4795-bb18-7ef94994924e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527865, 'reachable_time': 43811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212303, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.329 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[10f65b6b-b279-4bc0-8d80-504adfba1dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.371 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e325a576-f139-4eb5-956c-e23dac5bcc2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.372 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.373 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.373 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:55 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.418 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 NetworkManager[56177]: <info>  [1771249195.4220] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.423 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.424 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 ovn_controller[96072]: 2026-02-16T13:39:55Z|00140|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.431 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.431 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.432 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[83895aaa-cb86-4c56-a2da-ca956ec1eca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.433 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:39:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:39:55.434 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:39:55 compute-0 podman[212335]: 2026-02-16 13:39:55.766487474 +0000 UTC m=+0.048972953 container create 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 16 13:39:55 compute-0 systemd[1]: Started libpod-conmon-8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73.scope.
Feb 16 13:39:55 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:39:55 compute-0 podman[212335]: 2026-02-16 13:39:55.737783199 +0000 UTC m=+0.020268688 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:39:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3843e236d64219c7463eab8571ee742251b40a3e3c27e3f041c4eaefe86f58f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:39:55 compute-0 podman[212335]: 2026-02-16 13:39:55.847718327 +0000 UTC m=+0.130203806 container init 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 16 13:39:55 compute-0 podman[212335]: 2026-02-16 13:39:55.853087748 +0000 UTC m=+0.135573247 container start 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 13:39:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [NOTICE]   (212355) : New worker (212358) forked
Feb 16 13:39:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [NOTICE]   (212355) : Loading success.
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.875 185727 DEBUG nova.compute.manager [req-eb31cafc-c042-4ddd-8a0c-fcb66b9308f4 req-9734733d-9947-4577-b0f3-28aa0b0c980f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.876 185727 DEBUG oslo_concurrency.lockutils [req-eb31cafc-c042-4ddd-8a0c-fcb66b9308f4 req-9734733d-9947-4577-b0f3-28aa0b0c980f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.876 185727 DEBUG oslo_concurrency.lockutils [req-eb31cafc-c042-4ddd-8a0c-fcb66b9308f4 req-9734733d-9947-4577-b0f3-28aa0b0c980f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.876 185727 DEBUG oslo_concurrency.lockutils [req-eb31cafc-c042-4ddd-8a0c-fcb66b9308f4 req-9734733d-9947-4577-b0f3-28aa0b0c980f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.876 185727 DEBUG nova.compute.manager [req-eb31cafc-c042-4ddd-8a0c-fcb66b9308f4 req-9734733d-9947-4577-b0f3-28aa0b0c980f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Processing event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.963 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.964 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249195.9622886, 93d211b1-f197-4c96-a994-900df3bf28e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.965 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Started (Lifecycle Event)
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.970 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.974 185727 INFO nova.virt.libvirt.driver [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance spawned successfully.
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.974 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:39:55 compute-0 nova_compute[185723]: 2026-02-16 13:39:55.997 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.004 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.010 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.011 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.012 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.012 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.012 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.013 185727 DEBUG nova.virt.libvirt.driver [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.054 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.055 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249195.9639096, 93d211b1-f197-4c96-a994-900df3bf28e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.055 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Paused (Lifecycle Event)
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.084 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.085 185727 DEBUG nova.network.neutron [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updated VIF entry in instance network info cache for port 3dd3d50b-ad63-4bee-b823-c23750e7afc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.086 185727 DEBUG nova.network.neutron [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.090 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249195.9700058, 93d211b1-f197-4c96-a994-900df3bf28e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.090 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Resumed (Lifecycle Event)
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.105 185727 INFO nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Took 9.53 seconds to spawn the instance on the hypervisor.
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.106 185727 DEBUG nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.112 185727 DEBUG oslo_concurrency.lockutils [req-9210bd77-18eb-48f9-8d97-0d2aa4563078 req-61982301-bb6e-4d30-9fdc-64158dc27207 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.115 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.117 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.140 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.185 185727 INFO nova.compute.manager [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Took 10.08 seconds to build instance.
Feb 16 13:39:56 compute-0 nova_compute[185723]: 2026-02-16 13:39:56.210 185727 DEBUG oslo_concurrency.lockutils [None req-c39ebf5d-c856-474e-9028-98b0a35bbe86 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.987 185727 DEBUG nova.compute.manager [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.988 185727 DEBUG oslo_concurrency.lockutils [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.988 185727 DEBUG oslo_concurrency.lockutils [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.988 185727 DEBUG oslo_concurrency.lockutils [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.988 185727 DEBUG nova.compute.manager [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:39:57 compute-0 nova_compute[185723]: 2026-02-16 13:39:57.989 185727 WARNING nova.compute.manager [req-94e0c960-5a56-49a7-8969-eaba508317ed req-0b102f4a-4db5-4d04-a149-3f8acf087452 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state None.
Feb 16 13:39:58 compute-0 podman[212373]: 2026-02-16 13:39:58.073407494 +0000 UTC m=+0.115101245 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 16 13:39:58 compute-0 nova_compute[185723]: 2026-02-16 13:39:58.876 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:59 compute-0 podman[195053]: time="2026-02-16T13:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:39:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:39:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:39:59 compute-0 nova_compute[185723]: 2026-02-16 13:39:59.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:01 compute-0 openstack_network_exporter[197909]: ERROR   13:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:40:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:40:01 compute-0 openstack_network_exporter[197909]: ERROR   13:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:40:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:40:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:03.232 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:03.233 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:03.234 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:03 compute-0 nova_compute[185723]: 2026-02-16 13:40:03.878 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:04 compute-0 nova_compute[185723]: 2026-02-16 13:40:04.876 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:08 compute-0 nova_compute[185723]: 2026-02-16 13:40:08.880 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:09 compute-0 podman[212415]: 2026-02-16 13:40:09.001095112 +0000 UTC m=+0.041878669 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:40:09 compute-0 ovn_controller[96072]: 2026-02-16T13:40:09Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:e6:11 10.100.0.8
Feb 16 13:40:09 compute-0 ovn_controller[96072]: 2026-02-16T13:40:09Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:e6:11 10.100.0.8
Feb 16 13:40:09 compute-0 nova_compute[185723]: 2026-02-16 13:40:09.916 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:13 compute-0 nova_compute[185723]: 2026-02-16 13:40:13.900 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:14 compute-0 nova_compute[185723]: 2026-02-16 13:40:14.920 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:16 compute-0 nova_compute[185723]: 2026-02-16 13:40:16.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:16 compute-0 nova_compute[185723]: 2026-02-16 13:40:16.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:17 compute-0 nova_compute[185723]: 2026-02-16 13:40:17.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:17 compute-0 nova_compute[185723]: 2026-02-16 13:40:17.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:40:17 compute-0 nova_compute[185723]: 2026-02-16 13:40:17.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:40:18 compute-0 nova_compute[185723]: 2026-02-16 13:40:18.490 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:40:18 compute-0 nova_compute[185723]: 2026-02-16 13:40:18.490 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:40:18 compute-0 nova_compute[185723]: 2026-02-16 13:40:18.490 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:40:18 compute-0 nova_compute[185723]: 2026-02-16 13:40:18.491 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:18 compute-0 nova_compute[185723]: 2026-02-16 13:40:18.902 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:19 compute-0 nova_compute[185723]: 2026-02-16 13:40:19.923 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.805 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.849 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.850 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.850 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.851 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.851 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.851 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.851 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.874 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.874 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.875 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.875 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:40:20 compute-0 sshd-session[212439]: Invalid user postgres from 188.166.42.159 port 51950
Feb 16 13:40:20 compute-0 nova_compute[185723]: 2026-02-16 13:40:20.943 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.000 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.001 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.051 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.106 185727 DEBUG nova.compute.manager [None req-103b6117-0bd1-424f-9535-ba5c2d3491d6 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:40:21 compute-0 sshd-session[212439]: Connection closed by invalid user postgres 188.166.42.159 port 51950 [preauth]
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.183 185727 DEBUG nova.compute.provider_tree [None req-103b6117-0bd1-424f-9535-ba5c2d3491d6 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 19 to 24 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.238 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.239 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.1964225769043GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.240 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.240 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.420 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 93d211b1-f197-4c96-a994-900df3bf28e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.422 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.422 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.517 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.548 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.577 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:40:21 compute-0 nova_compute[185723]: 2026-02-16 13:40:21.577 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:22 compute-0 nova_compute[185723]: 2026-02-16 13:40:22.572 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:22 compute-0 nova_compute[185723]: 2026-02-16 13:40:22.573 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:23 compute-0 nova_compute[185723]: 2026-02-16 13:40:23.904 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:24 compute-0 nova_compute[185723]: 2026-02-16 13:40:24.926 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:25 compute-0 ovn_controller[96072]: 2026-02-16T13:40:25Z|00141|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Feb 16 13:40:26 compute-0 podman[212450]: 2026-02-16 13:40:26.018588328 +0000 UTC m=+0.053668368 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 16 13:40:26 compute-0 podman[212449]: 2026-02-16 13:40:26.023519929 +0000 UTC m=+0.062046404 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 16 13:40:27 compute-0 nova_compute[185723]: 2026-02-16 13:40:27.358 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Check if temp file /var/lib/nova/instances/tmp332w3zu3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:40:27 compute-0 nova_compute[185723]: 2026-02-16 13:40:27.359 185727 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:40:28 compute-0 nova_compute[185723]: 2026-02-16 13:40:28.907 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:29 compute-0 podman[212490]: 2026-02-16 13:40:29.039012266 +0000 UTC m=+0.074764175 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:40:29 compute-0 nova_compute[185723]: 2026-02-16 13:40:29.343 185727 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:29 compute-0 nova_compute[185723]: 2026-02-16 13:40:29.418 185727 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:29 compute-0 nova_compute[185723]: 2026-02-16 13:40:29.419 185727 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:29 compute-0 nova_compute[185723]: 2026-02-16 13:40:29.474 185727 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:29 compute-0 sshd-session[212489]: Invalid user admin from 146.190.226.24 port 50192
Feb 16 13:40:29 compute-0 sshd-session[212489]: Connection closed by invalid user admin 146.190.226.24 port 50192 [preauth]
Feb 16 13:40:29 compute-0 podman[195053]: time="2026-02-16T13:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:40:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:40:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2644 "" "Go-http-client/1.1"
Feb 16 13:40:29 compute-0 nova_compute[185723]: 2026-02-16 13:40:29.935 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:31 compute-0 openstack_network_exporter[197909]: ERROR   13:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:40:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:40:31 compute-0 openstack_network_exporter[197909]: ERROR   13:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:40:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:40:33 compute-0 sshd-session[212524]: Invalid user admin from 64.227.72.94 port 54830
Feb 16 13:40:33 compute-0 sshd-session[212524]: Connection closed by invalid user admin 64.227.72.94 port 54830 [preauth]
Feb 16 13:40:33 compute-0 sshd-session[212526]: Accepted publickey for nova from 192.168.122.101 port 39350 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:40:33 compute-0 systemd-logind[818]: New session 31 of user nova.
Feb 16 13:40:33 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:40:33 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:40:33 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:40:33 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:40:33 compute-0 systemd[212530]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:40:33 compute-0 systemd[212530]: Queued start job for default target Main User Target.
Feb 16 13:40:33 compute-0 systemd[212530]: Created slice User Application Slice.
Feb 16 13:40:33 compute-0 systemd[212530]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:40:33 compute-0 systemd[212530]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:40:33 compute-0 systemd[212530]: Reached target Paths.
Feb 16 13:40:33 compute-0 systemd[212530]: Reached target Timers.
Feb 16 13:40:33 compute-0 systemd[212530]: Starting D-Bus User Message Bus Socket...
Feb 16 13:40:33 compute-0 systemd[212530]: Starting Create User's Volatile Files and Directories...
Feb 16 13:40:33 compute-0 systemd[212530]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:40:33 compute-0 systemd[212530]: Reached target Sockets.
Feb 16 13:40:33 compute-0 systemd[212530]: Finished Create User's Volatile Files and Directories.
Feb 16 13:40:33 compute-0 systemd[212530]: Reached target Basic System.
Feb 16 13:40:33 compute-0 systemd[212530]: Reached target Main User Target.
Feb 16 13:40:33 compute-0 systemd[212530]: Startup finished in 125ms.
Feb 16 13:40:33 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:40:33 compute-0 systemd[1]: Started Session 31 of User nova.
Feb 16 13:40:33 compute-0 sshd-session[212526]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:40:33 compute-0 nova_compute[185723]: 2026-02-16 13:40:33.909 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:33 compute-0 sshd-session[212546]: Received disconnect from 192.168.122.101 port 39350:11: disconnected by user
Feb 16 13:40:33 compute-0 sshd-session[212546]: Disconnected from user nova 192.168.122.101 port 39350
Feb 16 13:40:33 compute-0 sshd-session[212526]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:40:33 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Feb 16 13:40:33 compute-0 systemd-logind[818]: Session 31 logged out. Waiting for processes to exit.
Feb 16 13:40:33 compute-0 systemd-logind[818]: Removed session 31.
Feb 16 13:40:34 compute-0 nova_compute[185723]: 2026-02-16 13:40:34.937 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.058 185727 DEBUG nova.compute.manager [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.059 185727 DEBUG oslo_concurrency.lockutils [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.059 185727 DEBUG oslo_concurrency.lockutils [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.059 185727 DEBUG oslo_concurrency.lockutils [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.059 185727 DEBUG nova.compute.manager [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:35 compute-0 nova_compute[185723]: 2026-02-16 13:40:35.059 185727 DEBUG nova.compute.manager [req-5058700b-b034-4d98-be28-7aeada04e7ad req-a2c7c20d-94c2-46c4-8888-12b873801d20 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.037 185727 INFO nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Took 6.56 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.038 185727 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.067 185727 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(3dc052c0-2e01-427f-91f3-ee707b871d5a),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.097 185727 DEBUG nova.objects.instance [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.099 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.102 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.102 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.134 185727 DEBUG nova.virt.libvirt.vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:39:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:39:56Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.135 185727 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.136 185727 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.137 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:40:36 compute-0 nova_compute[185723]:   <mac address="fa:16:3e:dd:e6:11"/>
Feb 16 13:40:36 compute-0 nova_compute[185723]:   <model type="virtio"/>
Feb 16 13:40:36 compute-0 nova_compute[185723]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:40:36 compute-0 nova_compute[185723]:   <mtu size="1442"/>
Feb 16 13:40:36 compute-0 nova_compute[185723]:   <target dev="tap3dd3d50b-ad"/>
Feb 16 13:40:36 compute-0 nova_compute[185723]: </interface>
Feb 16 13:40:36 compute-0 nova_compute[185723]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.138 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.606 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.607 185727 INFO nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:40:36 compute-0 nova_compute[185723]: 2026-02-16 13:40:36.745 185727 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.183 185727 DEBUG nova.compute.manager [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.184 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.184 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.185 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.185 185727 DEBUG nova.compute.manager [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.185 185727 WARNING nova.compute.manager [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state migrating.
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.186 185727 DEBUG nova.compute.manager [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-changed-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.186 185727 DEBUG nova.compute.manager [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Refreshing instance network info cache due to event network-changed-3dd3d50b-ad63-4bee-b823-c23750e7afc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.187 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.187 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.188 185727 DEBUG nova.network.neutron [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Refreshing network info cache for port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.248 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.249 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.526 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249237.525849, 93d211b1-f197-4c96-a994-900df3bf28e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.526 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Paused (Lifecycle Event)
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.552 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.557 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.585 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.752 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.753 185727 DEBUG nova.virt.libvirt.migration [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:40:37 compute-0 kernel: tap3dd3d50b-ad (unregistering): left promiscuous mode
Feb 16 13:40:37 compute-0 NetworkManager[56177]: <info>  [1771249237.7718] device (tap3dd3d50b-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.778 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:37 compute-0 ovn_controller[96072]: 2026-02-16T13:40:37Z|00142|binding|INFO|Releasing lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 from this chassis (sb_readonly=0)
Feb 16 13:40:37 compute-0 ovn_controller[96072]: 2026-02-16T13:40:37Z|00143|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 down in Southbound
Feb 16 13:40:37 compute-0 ovn_controller[96072]: 2026-02-16T13:40:37Z|00144|binding|INFO|Removing iface tap3dd3d50b-ad ovn-installed in OVS
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.784 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:37 compute-0 nova_compute[185723]: 2026-02-16 13:40:37.788 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:37.789 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:e6:11 10.100.0.8'], port_security=['fa:16:3e:dd:e6:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '54c1a259-778a-4222-b2c6-8422ea19a065'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93d211b1-f197-4c96-a994-900df3bf28e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=3dd3d50b-ad63-4bee-b823-c23750e7afc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:37.793 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:40:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:37.795 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:40:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:37.796 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e324f8-1bb7-46f2-bc63-3c56105fc17a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:37.797 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:40:37 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Deactivated successfully.
Feb 16 13:40:37 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Consumed 14.988s CPU time.
Feb 16 13:40:37 compute-0 systemd-machined[155229]: Machine qemu-12-instance-00000010 terminated.
Feb 16 13:40:37 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [NOTICE]   (212355) : haproxy version is 2.8.14-c23fe91
Feb 16 13:40:37 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [NOTICE]   (212355) : path to executable is /usr/sbin/haproxy
Feb 16 13:40:37 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [WARNING]  (212355) : Exiting Master process...
Feb 16 13:40:37 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [ALERT]    (212355) : Current worker (212358) exited with code 143 (Terminated)
Feb 16 13:40:37 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212351]: [WARNING]  (212355) : All workers exited. Exiting... (0)
Feb 16 13:40:37 compute-0 systemd[1]: libpod-8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73.scope: Deactivated successfully.
Feb 16 13:40:37 compute-0 podman[212583]: 2026-02-16 13:40:37.936557856 +0000 UTC m=+0.049102528 container died 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 16 13:40:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73-userdata-shm.mount: Deactivated successfully.
Feb 16 13:40:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-3843e236d64219c7463eab8571ee742251b40a3e3c27e3f041c4eaefe86f58f9-merged.mount: Deactivated successfully.
Feb 16 13:40:37 compute-0 podman[212583]: 2026-02-16 13:40:37.985416067 +0000 UTC m=+0.097960749 container cleanup 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:40:37 compute-0 systemd[1]: libpod-conmon-8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73.scope: Deactivated successfully.
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.010 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.010 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.010 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:40:38 compute-0 podman[212628]: 2026-02-16 13:40:38.051463665 +0000 UTC m=+0.044491724 container remove 8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.056 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[52fa4a8a-fb7f-4367-8e75-c4a0b465bc76]: (4, ('Mon Feb 16 01:40:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73)\n8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73\nMon Feb 16 01:40:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73)\n8c9b7905a09b21da81a7ff46af6afeb97f451a14a24b69ccc0f5f0ca26396a73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.058 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[22c1aa5f-3aa8-435c-a6fd-3d054725494d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.060 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.063 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:38 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.071 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.074 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0dd9c2-9ef4-4a31-bf55-55d08a1801a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.089 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[80fbd298-6d6f-46f6-bc37-6cbf428c468f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.091 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[31d73e42-b550-402c-9bdd-5b1e904cca3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.103 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f12a30bf-e427-4e20-a584-78fbfeee61aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527859, 'reachable_time': 25436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212649, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.108 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:40:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:38.108 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[046dc23a-4711-401d-a4ab-0128262cd35a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.255 185727 DEBUG nova.virt.libvirt.guest [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '93d211b1-f197-4c96-a994-900df3bf28e4' (instance-00000010) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.256 185727 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migration operation has completed
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.256 185727 INFO nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] _post_live_migration() is started..
Feb 16 13:40:38 compute-0 nova_compute[185723]: 2026-02-16 13:40:38.912 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.364 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.364 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.365 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.365 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.365 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.365 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.366 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.366 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.366 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.366 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.366 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.367 185727 WARNING nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state migrating.
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.367 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.367 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.367 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.368 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.368 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.368 185727 WARNING nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state migrating.
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.368 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.369 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.369 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.369 185727 DEBUG oslo_concurrency.lockutils [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.369 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.370 185727 DEBUG nova.compute.manager [req-1a03a5eb-61f6-4355-8824-fa550e063261 req-e6b00766-f4c2-435f-8b85-f683fdb8fbf0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.527 185727 DEBUG nova.network.neutron [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updated VIF entry in instance network info cache for port 3dd3d50b-ad63-4bee-b823-c23750e7afc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.529 185727 DEBUG nova.network.neutron [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.557 185727 DEBUG oslo_concurrency.lockutils [req-10d52664-a131-4eff-a023-664089845cf4 req-40163452-b404-4ccb-bffd-252813d08963 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.613 185727 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.614 185727 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.615 185727 DEBUG nova.virt.libvirt.vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:39:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:40:24Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.615 185727 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.616 185727 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.617 185727 DEBUG os_vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.619 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.619 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3dd3d50b-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.621 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.623 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.627 185727 INFO os_vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad')
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.628 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.628 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.629 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.629 185727 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.629 185727 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Deleting instance files /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4_del
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.630 185727 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Deletion of /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4_del complete
Feb 16 13:40:39 compute-0 nova_compute[185723]: 2026-02-16 13:40:39.939 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:40 compute-0 podman[212650]: 2026-02-16 13:40:40.037590554 +0000 UTC m=+0.067436953 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.574 185727 DEBUG nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.574 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.574 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.574 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.575 185727 DEBUG nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.575 185727 WARNING nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state migrating.
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.575 185727 DEBUG nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.575 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.575 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.576 185727 DEBUG oslo_concurrency.lockutils [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.576 185727 DEBUG nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:41 compute-0 nova_compute[185723]: 2026-02-16 13:40:41.576 185727 WARNING nova.compute.manager [req-2b50195e-a390-40f9-8729-b1aaa6aa72cf req-07869893-de58-46f0-a09e-14cb94d0c1df faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state active and task_state migrating.
Feb 16 13:40:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:40:44 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:40:44 compute-0 systemd[212530]: Activating special unit Exit the Session...
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped target Main User Target.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped target Basic System.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped target Paths.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped target Sockets.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped target Timers.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:40:44 compute-0 systemd[212530]: Closed D-Bus User Message Bus Socket.
Feb 16 13:40:44 compute-0 systemd[212530]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:40:44 compute-0 systemd[212530]: Removed slice User Application Slice.
Feb 16 13:40:44 compute-0 systemd[212530]: Reached target Shutdown.
Feb 16 13:40:44 compute-0 systemd[212530]: Finished Exit the Session.
Feb 16 13:40:44 compute-0 systemd[212530]: Reached target Exit the Session.
Feb 16 13:40:44 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:40:44 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:40:44 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:40:44 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:40:44 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:40:44 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:40:44 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:40:44 compute-0 nova_compute[185723]: 2026-02-16 13:40:44.658 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:44 compute-0 nova_compute[185723]: 2026-02-16 13:40:44.942 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.631 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.632 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.632 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.707 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.708 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.709 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.709 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.876 185727 WARNING nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.877 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5795MB free_disk=73.22513961791992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.877 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.878 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:46 compute-0 nova_compute[185723]: 2026-02-16 13:40:46.961 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 93d211b1-f197-4c96-a994-900df3bf28e4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.000 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.073 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 3dc052c0-2e01-427f-91f3-ee707b871d5a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.074 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.074 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.114 185727 DEBUG nova.compute.provider_tree [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.132 185727 DEBUG nova.scheduler.client.report [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.190 185727 DEBUG nova.compute.resource_tracker [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.191 185727 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.196 185727 INFO nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.292 185727 INFO nova.scheduler.client.report [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 3dc052c0-2e01-427f-91f3-ee707b871d5a
Feb 16 13:40:47 compute-0 nova_compute[185723]: 2026-02-16 13:40:47.293 185727 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:40:48 compute-0 nova_compute[185723]: 2026-02-16 13:40:48.545 185727 DEBUG nova.compute.manager [None req-1c5c408e-3b2c-45af-b4ec-3b219bdd3e3d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:40:48 compute-0 nova_compute[185723]: 2026-02-16 13:40:48.605 185727 DEBUG nova.compute.provider_tree [None req-1c5c408e-3b2c-45af-b4ec-3b219bdd3e3d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 24 to 27 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:40:49 compute-0 nova_compute[185723]: 2026-02-16 13:40:49.662 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:49 compute-0 nova_compute[185723]: 2026-02-16 13:40:49.943 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:53 compute-0 nova_compute[185723]: 2026-02-16 13:40:53.010 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249238.0068498, 93d211b1-f197-4c96-a994-900df3bf28e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:40:53 compute-0 nova_compute[185723]: 2026-02-16 13:40:53.011 185727 INFO nova.compute.manager [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Stopped (Lifecycle Event)
Feb 16 13:40:53 compute-0 nova_compute[185723]: 2026-02-16 13:40:53.040 185727 DEBUG nova.compute.manager [None req-22bf30a1-2ce4-41fd-ae9e-e7d7cd43f017 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:40:54 compute-0 nova_compute[185723]: 2026-02-16 13:40:54.666 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:54 compute-0 nova_compute[185723]: 2026-02-16 13:40:54.946 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:56 compute-0 nova_compute[185723]: 2026-02-16 13:40:56.020 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:56.023 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:56 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:40:56.024 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:40:57 compute-0 podman[212681]: 2026-02-16 13:40:57.047414376 +0000 UTC m=+0.082995209 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, managed_by=edpm_ansible)
Feb 16 13:40:57 compute-0 podman[212682]: 2026-02-16 13:40:57.049024926 +0000 UTC m=+0.087947282 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:40:59 compute-0 nova_compute[185723]: 2026-02-16 13:40:59.668 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:59 compute-0 podman[195053]: time="2026-02-16T13:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:40:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:40:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:40:59 compute-0 nova_compute[185723]: 2026-02-16 13:40:59.947 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:00 compute-0 podman[212719]: 2026-02-16 13:41:00.054652677 +0000 UTC m=+0.093563030 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 16 13:41:01 compute-0 openstack_network_exporter[197909]: ERROR   13:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:41:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:41:01 compute-0 openstack_network_exporter[197909]: ERROR   13:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:41:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:41:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:03.234 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:03.235 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:03.236 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:04 compute-0 nova_compute[185723]: 2026-02-16 13:41:04.915 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:04 compute-0 nova_compute[185723]: 2026-02-16 13:41:04.950 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:05 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:05.027 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:09 compute-0 nova_compute[185723]: 2026-02-16 13:41:09.918 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:09 compute-0 nova_compute[185723]: 2026-02-16 13:41:09.953 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:11 compute-0 podman[212746]: 2026-02-16 13:41:11.033427552 +0000 UTC m=+0.074099247 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:41:14 compute-0 sshd-session[212770]: Invalid user postgres from 188.166.42.159 port 58454
Feb 16 13:41:14 compute-0 sshd-session[212770]: Connection closed by invalid user postgres 188.166.42.159 port 58454 [preauth]
Feb 16 13:41:14 compute-0 nova_compute[185723]: 2026-02-16 13:41:14.921 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:14 compute-0 nova_compute[185723]: 2026-02-16 13:41:14.955 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:16 compute-0 nova_compute[185723]: 2026-02-16 13:41:16.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:17 compute-0 nova_compute[185723]: 2026-02-16 13:41:17.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:17 compute-0 sshd-session[212772]: Invalid user git from 146.190.22.227 port 57002
Feb 16 13:41:18 compute-0 sshd-session[212772]: Connection closed by invalid user git 146.190.22.227 port 57002 [preauth]
Feb 16 13:41:18 compute-0 nova_compute[185723]: 2026-02-16 13:41:18.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:19 compute-0 sshd-session[212774]: Invalid user admin from 64.227.72.94 port 49210
Feb 16 13:41:19 compute-0 sshd-session[212774]: Connection closed by invalid user admin 64.227.72.94 port 49210 [preauth]
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.453 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.455 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.479 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.479 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.479 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.480 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.617 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.618 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=73.22511672973633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.618 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.618 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.924 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.927 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.927 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.941 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.955 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.956 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.957 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.970 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:41:19 compute-0 nova_compute[185723]: 2026-02-16 13:41:19.992 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.011 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.028 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.030 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.030 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.224 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.225 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.248 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.328 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.329 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.338 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.338 185727 INFO nova.compute.claims [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.450 185727 DEBUG nova.compute.provider_tree [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.465 185727 DEBUG nova.scheduler.client.report [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.488 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.489 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.543 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.544 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.565 185727 INFO nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.586 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.700 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.702 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.702 185727 INFO nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Creating image(s)
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.703 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.703 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.704 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.716 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.790 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.792 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.792 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.810 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.863 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.864 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.894 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.895 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.895 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.949 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.950 185727 DEBUG nova.virt.disk.api [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.951 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.997 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.998 185727 DEBUG nova.virt.disk.api [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:41:20 compute-0 nova_compute[185723]: 2026-02-16 13:41:20.998 185727 DEBUG nova.objects.instance [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 44f63a81-024b-446b-a144-28445aaae47c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.009 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.038 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.053 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.053 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Ensure instance console log exists: /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.054 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.054 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.054 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:41:21 compute-0 nova_compute[185723]: 2026-02-16 13:41:21.542 185727 DEBUG nova.policy [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:41:22 compute-0 nova_compute[185723]: 2026-02-16 13:41:22.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:23 compute-0 nova_compute[185723]: 2026-02-16 13:41:23.159 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Successfully created port: f4039b0f-5331-420f-9e9f-432d4b817a98 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.757 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Successfully updated port: f4039b0f-5331-420f-9e9f-432d4b817a98 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.774 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.774 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.774 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.901 185727 DEBUG nova.compute.manager [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-changed-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.902 185727 DEBUG nova.compute.manager [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Refreshing instance network info cache due to event network-changed-f4039b0f-5331-420f-9e9f-432d4b817a98. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.902 185727 DEBUG oslo_concurrency.lockutils [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.927 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:24 compute-0 nova_compute[185723]: 2026-02-16 13:41:24.958 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:25 compute-0 nova_compute[185723]: 2026-02-16 13:41:25.545 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.888 185727 DEBUG nova.network.neutron [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updating instance_info_cache with network_info: [{"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.906 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.906 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Instance network_info: |[{"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.907 185727 DEBUG oslo_concurrency.lockutils [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.907 185727 DEBUG nova.network.neutron [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Refreshing network info cache for port f4039b0f-5331-420f-9e9f-432d4b817a98 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.909 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Start _get_guest_xml network_info=[{"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.914 185727 WARNING nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.923 185727 DEBUG nova.virt.libvirt.host [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.923 185727 DEBUG nova.virt.libvirt.host [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.927 185727 DEBUG nova.virt.libvirt.host [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.928 185727 DEBUG nova.virt.libvirt.host [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.928 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.929 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.929 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.929 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.929 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.930 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.931 185727 DEBUG nova.virt.hardware [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.933 185727 DEBUG nova.virt.libvirt.vif [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-230655652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-230655652',id=17,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4tu80fu6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:41:20Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=44f63a81-024b-446b-a144-28445aaae47c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.934 185727 DEBUG nova.network.os_vif_util [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.934 185727 DEBUG nova.network.os_vif_util [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.935 185727 DEBUG nova.objects.instance [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 44f63a81-024b-446b-a144-28445aaae47c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.954 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <uuid>44f63a81-024b-446b-a144-28445aaae47c</uuid>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <name>instance-00000011</name>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-230655652</nova:name>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:41:26</nova:creationTime>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         <nova:port uuid="f4039b0f-5331-420f-9e9f-432d4b817a98">
Feb 16 13:41:26 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <system>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="serial">44f63a81-024b-446b-a144-28445aaae47c</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="uuid">44f63a81-024b-446b-a144-28445aaae47c</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </system>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <os>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </os>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <features>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </features>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.config"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:c5:de:ed"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <target dev="tapf4039b0f-53"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/console.log" append="off"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <video>
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </video>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:41:26 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:41:26 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:41:26 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:41:26 compute-0 nova_compute[185723]: </domain>
Feb 16 13:41:26 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.956 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Preparing to wait for external event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.956 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.956 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.956 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.957 185727 DEBUG nova.virt.libvirt.vif [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-230655652',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-230655652',id=17,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4tu80fu6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:41:20Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=44f63a81-024b-446b-a144-28445aaae47c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.957 185727 DEBUG nova.network.os_vif_util [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.958 185727 DEBUG nova.network.os_vif_util [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.959 185727 DEBUG os_vif [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.960 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.960 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.960 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.962 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.963 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4039b0f-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.963 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4039b0f-53, col_values=(('external_ids', {'iface-id': 'f4039b0f-5331-420f-9e9f-432d4b817a98', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:de:ed', 'vm-uuid': '44f63a81-024b-446b-a144-28445aaae47c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.965 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:26 compute-0 NetworkManager[56177]: <info>  [1771249286.9661] manager: (tapf4039b0f-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.967 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.970 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:26 compute-0 nova_compute[185723]: 2026-02-16 13:41:26.971 185727 INFO os_vif [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53')
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.019 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.020 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.020 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:c5:de:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.020 185727 INFO nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Using config drive
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.605 185727 INFO nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Creating config drive at /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.config
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.609 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpahua3wrz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.733 185727 DEBUG oslo_concurrency.processutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpahua3wrz" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:27 compute-0 kernel: tapf4039b0f-53: entered promiscuous mode
Feb 16 13:41:27 compute-0 NetworkManager[56177]: <info>  [1771249287.7893] manager: (tapf4039b0f-53): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.791 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:27 compute-0 ovn_controller[96072]: 2026-02-16T13:41:27Z|00145|binding|INFO|Claiming lport f4039b0f-5331-420f-9e9f-432d4b817a98 for this chassis.
Feb 16 13:41:27 compute-0 ovn_controller[96072]: 2026-02-16T13:41:27Z|00146|binding|INFO|f4039b0f-5331-420f-9e9f-432d4b817a98: Claiming fa:16:3e:c5:de:ed 10.100.0.4
Feb 16 13:41:27 compute-0 ovn_controller[96072]: 2026-02-16T13:41:27Z|00147|binding|INFO|Setting lport f4039b0f-5331-420f-9e9f-432d4b817a98 ovn-installed in OVS
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.797 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:27 compute-0 ovn_controller[96072]: 2026-02-16T13:41:27Z|00148|binding|INFO|Setting lport f4039b0f-5331-420f-9e9f-432d4b817a98 up in Southbound
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.799 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:de:ed 10.100.0.4'], port_security=['fa:16:3e:c5:de:ed 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '44f63a81-024b-446b-a144-28445aaae47c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=f4039b0f-5331-420f-9e9f-432d4b817a98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.800 105360 INFO neutron.agent.ovn.metadata.agent [-] Port f4039b0f-5331-420f-9e9f-432d4b817a98 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.801 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.815 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.816 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a7dfea-edda-41d8-886d-556d7637a14a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.818 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.819 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.820 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.820 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d14bb090-c0fd-4296-8e49-91f570f0aedf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.821 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac18da8-0508-4a4a-b1e8-23e429a089e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 systemd-udevd[212832]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.831 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[63a72b70-98a3-48f9-a60d-34bc904e2c8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 NetworkManager[56177]: <info>  [1771249287.8418] device (tapf4039b0f-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:41:27 compute-0 NetworkManager[56177]: <info>  [1771249287.8428] device (tapf4039b0f-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:41:27 compute-0 systemd-machined[155229]: New machine qemu-13-instance-00000011.
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.856 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca8f856-68a1-4d56-b4d9-f33acf58f614]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.880 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[32e152ad-8d1e-4e71-b11f-772de37668d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 podman[212804]: 2026-02-16 13:41:27.883739142 +0000 UTC m=+0.104431849 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.883 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebb29e3-4bfb-4aa0-bc81-03da0506018a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 systemd-udevd[212847]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:41:27 compute-0 podman[212798]: 2026-02-16 13:41:27.885780143 +0000 UTC m=+0.109734581 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 16 13:41:27 compute-0 NetworkManager[56177]: <info>  [1771249287.8860] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.914 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[721a95c6-6386-4597-bb4f-2a2e2f2a7ec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.919 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[163de89e-df02-4725-972e-4dc5990b7259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 NetworkManager[56177]: <info>  [1771249287.9393] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.945 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e104e7-52d6-4601-9730-cc0572da9fa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.962 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ab67dc7d-8a31-4e3e-ae7b-64aeadc2ff76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537133, 'reachable_time': 44239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212881, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.976 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d71896-47f2-412d-819a-eec7c892301b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537133, 'tstamp': 537133}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212882, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:27.989 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fc47190a-b3db-4b46-82b0-3f9d0d225660]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537133, 'reachable_time': 44239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212883, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.995 185727 DEBUG nova.compute.manager [req-88726e4a-5880-4735-9012-b68d68d9494e req-6b31b3f2-4cfe-4420-bfde-013725822ac0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.996 185727 DEBUG oslo_concurrency.lockutils [req-88726e4a-5880-4735-9012-b68d68d9494e req-6b31b3f2-4cfe-4420-bfde-013725822ac0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.996 185727 DEBUG oslo_concurrency.lockutils [req-88726e4a-5880-4735-9012-b68d68d9494e req-6b31b3f2-4cfe-4420-bfde-013725822ac0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.997 185727 DEBUG oslo_concurrency.lockutils [req-88726e4a-5880-4735-9012-b68d68d9494e req-6b31b3f2-4cfe-4420-bfde-013725822ac0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:27 compute-0 nova_compute[185723]: 2026-02-16 13:41:27.997 185727 DEBUG nova.compute.manager [req-88726e4a-5880-4735-9012-b68d68d9494e req-6b31b3f2-4cfe-4420-bfde-013725822ac0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Processing event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.016 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f96caf-c857-40bd-9b74-e1a335a63f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.065 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[636fe111-a914-4e28-b9e3-672b527c3070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.066 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.066 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.067 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:28 compute-0 NetworkManager[56177]: <info>  [1771249288.0699] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Feb 16 13:41:28 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.070 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.076 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.077 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:28 compute-0 ovn_controller[96072]: 2026-02-16T13:41:28Z|00149|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.080 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.082 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.082 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d5b7a15e-178e-437e-b15f-2153f837a217]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.083 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:41:28 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:41:28.084 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.210 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.211 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249288.2101657, 44f63a81-024b-446b-a144-28445aaae47c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.211 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] VM Started (Lifecycle Event)
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.214 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.218 185727 INFO nova.virt.libvirt.driver [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Instance spawned successfully.
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.218 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.241 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.245 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.250 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.250 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.250 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.251 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.251 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.251 185727 DEBUG nova.virt.libvirt.driver [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.282 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.283 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249288.2110903, 44f63a81-024b-446b-a144-28445aaae47c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.283 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] VM Paused (Lifecycle Event)
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.312 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.316 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249288.2130578, 44f63a81-024b-446b-a144-28445aaae47c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.316 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] VM Resumed (Lifecycle Event)
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.337 185727 INFO nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Took 7.64 seconds to spawn the instance on the hypervisor.
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.338 185727 DEBUG nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.344 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.346 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.366 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:41:28 compute-0 podman[212921]: 2026-02-16 13:41:28.397824515 +0000 UTC m=+0.039092190 container create 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.408 185727 INFO nova.compute.manager [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Took 8.10 seconds to build instance.
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.425 185727 DEBUG oslo_concurrency.lockutils [None req-d15e0874-b328-4f20-a78d-7b9847294e72 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:28 compute-0 systemd[1]: Started libpod-conmon-42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f.scope.
Feb 16 13:41:28 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:41:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/035db9f9b501e1ba06ecf122331369d532122cbed242aa804dee3180072ddefe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:41:28 compute-0 podman[212921]: 2026-02-16 13:41:28.375314877 +0000 UTC m=+0.016582572 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:41:28 compute-0 podman[212921]: 2026-02-16 13:41:28.476120915 +0000 UTC m=+0.117388610 container init 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 16 13:41:28 compute-0 podman[212921]: 2026-02-16 13:41:28.48031629 +0000 UTC m=+0.121583965 container start 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:41:28 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [NOTICE]   (212940) : New worker (212942) forked
Feb 16 13:41:28 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [NOTICE]   (212940) : Loading success.
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.967 185727 DEBUG nova.network.neutron [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updated VIF entry in instance network info cache for port f4039b0f-5331-420f-9e9f-432d4b817a98. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:41:28 compute-0 nova_compute[185723]: 2026-02-16 13:41:28.967 185727 DEBUG nova.network.neutron [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updating instance_info_cache with network_info: [{"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:41:29 compute-0 nova_compute[185723]: 2026-02-16 13:41:29.322 185727 DEBUG oslo_concurrency.lockutils [req-bbdf67c6-bb7c-44f8-8cab-8a798d9139be req-a455b107-1c1d-4e1c-a429-ca7c1169f475 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:41:29 compute-0 podman[195053]: time="2026-02-16T13:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:41:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:41:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 13:41:29 compute-0 nova_compute[185723]: 2026-02-16 13:41:29.961 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.092 185727 DEBUG nova.compute.manager [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.094 185727 DEBUG oslo_concurrency.lockutils [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.094 185727 DEBUG oslo_concurrency.lockutils [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.094 185727 DEBUG oslo_concurrency.lockutils [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.095 185727 DEBUG nova.compute.manager [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] No waiting events found dispatching network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:41:30 compute-0 nova_compute[185723]: 2026-02-16 13:41:30.095 185727 WARNING nova.compute.manager [req-ceda5cb3-d21d-4618-b5e2-805aa6e28a09 req-74dbe6d6-bde7-4cd9-9bf3-a92df617e2bd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received unexpected event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 for instance with vm_state active and task_state None.
Feb 16 13:41:31 compute-0 podman[212951]: 2026-02-16 13:41:31.036699873 +0000 UTC m=+0.070874358 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:41:31 compute-0 openstack_network_exporter[197909]: ERROR   13:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:41:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:41:31 compute-0 openstack_network_exporter[197909]: ERROR   13:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:41:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:41:31 compute-0 nova_compute[185723]: 2026-02-16 13:41:31.966 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:33 compute-0 sshd-session[212978]: Invalid user admin from 146.190.226.24 port 35410
Feb 16 13:41:33 compute-0 sshd-session[212978]: Connection closed by invalid user admin 146.190.226.24 port 35410 [preauth]
Feb 16 13:41:35 compute-0 nova_compute[185723]: 2026-02-16 13:41:35.045 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:36 compute-0 nova_compute[185723]: 2026-02-16 13:41:36.969 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:40 compute-0 nova_compute[185723]: 2026-02-16 13:41:40.048 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:41 compute-0 nova_compute[185723]: 2026-02-16 13:41:41.972 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:42 compute-0 podman[212998]: 2026-02-16 13:41:42.019583261 +0000 UTC m=+0.055966099 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:41:44 compute-0 ovn_controller[96072]: 2026-02-16T13:41:44Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c5:de:ed 10.100.0.4
Feb 16 13:41:44 compute-0 ovn_controller[96072]: 2026-02-16T13:41:44Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c5:de:ed 10.100.0.4
Feb 16 13:41:45 compute-0 nova_compute[185723]: 2026-02-16 13:41:45.049 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:46 compute-0 nova_compute[185723]: 2026-02-16 13:41:46.974 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:50 compute-0 nova_compute[185723]: 2026-02-16 13:41:50.050 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:51 compute-0 nova_compute[185723]: 2026-02-16 13:41:51.991 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:55 compute-0 nova_compute[185723]: 2026-02-16 13:41:55.053 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:57 compute-0 nova_compute[185723]: 2026-02-16 13:41:57.002 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:58 compute-0 podman[213023]: 2026-02-16 13:41:58.007395764 +0000 UTC m=+0.048012311 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Feb 16 13:41:58 compute-0 podman[213024]: 2026-02-16 13:41:58.021191276 +0000 UTC m=+0.055460796 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:41:59 compute-0 podman[195053]: time="2026-02-16T13:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:41:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:41:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 13:42:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:00.054 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:01 compute-0 openstack_network_exporter[197909]: ERROR   13:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:42:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:42:01 compute-0 openstack_network_exporter[197909]: ERROR   13:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:42:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:42:02 compute-0 nova_compute[185723]: 2026-02-16 13:42:02.004 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:02 compute-0 podman[213063]: 2026-02-16 13:42:02.026277409 +0000 UTC m=+0.068579891 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:42:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:03.236 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:03.236 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:03.237 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:05 compute-0 nova_compute[185723]: 2026-02-16 13:42:05.056 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:05 compute-0 sshd-session[213090]: Invalid user admin from 64.227.72.94 port 34182
Feb 16 13:42:05 compute-0 sshd-session[213090]: Connection closed by invalid user admin 64.227.72.94 port 34182 [preauth]
Feb 16 13:42:06 compute-0 sshd-session[213092]: Invalid user postgres from 188.166.42.159 port 40722
Feb 16 13:42:06 compute-0 sshd-session[213092]: Connection closed by invalid user postgres 188.166.42.159 port 40722 [preauth]
Feb 16 13:42:07 compute-0 nova_compute[185723]: 2026-02-16 13:42:07.008 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:10 compute-0 nova_compute[185723]: 2026-02-16 13:42:10.058 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:11 compute-0 ovn_controller[96072]: 2026-02-16T13:42:11Z|00150|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 16 13:42:12 compute-0 nova_compute[185723]: 2026-02-16 13:42:12.011 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:13 compute-0 podman[213094]: 2026-02-16 13:42:13.03430429 +0000 UTC m=+0.071816841 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:42:15 compute-0 nova_compute[185723]: 2026-02-16 13:42:15.059 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:16 compute-0 nova_compute[185723]: 2026-02-16 13:42:16.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:17 compute-0 nova_compute[185723]: 2026-02-16 13:42:17.012 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:19 compute-0 nova_compute[185723]: 2026-02-16 13:42:19.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:19 compute-0 nova_compute[185723]: 2026-02-16 13:42:19.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:42:19 compute-0 nova_compute[185723]: 2026-02-16 13:42:19.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:42:20 compute-0 nova_compute[185723]: 2026-02-16 13:42:20.060 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:20 compute-0 nova_compute[185723]: 2026-02-16 13:42:20.550 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:42:20 compute-0 nova_compute[185723]: 2026-02-16 13:42:20.550 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:42:20 compute-0 nova_compute[185723]: 2026-02-16 13:42:20.551 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:42:20 compute-0 nova_compute[185723]: 2026-02-16 13:42:20.551 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 44f63a81-024b-446b-a144-28445aaae47c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:22 compute-0 nova_compute[185723]: 2026-02-16 13:42:22.015 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.835 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updating instance_info_cache with network_info: [{"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.854 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-44f63a81-024b-446b-a144-28445aaae47c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.855 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.855 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.855 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.855 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.856 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.856 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.856 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.878 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.879 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.879 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.879 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:42:23 compute-0 nova_compute[185723]: 2026-02-16 13:42:23.945 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.001 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.002 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.052 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.228 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.229 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5644MB free_disk=73.19628524780273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.229 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.230 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.314 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 44f63a81-024b-446b-a144-28445aaae47c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.314 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.315 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.369 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.383 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.400 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.400 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.978 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:24 compute-0 nova_compute[185723]: 2026-02-16 13:42:24.978 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:25 compute-0 nova_compute[185723]: 2026-02-16 13:42:25.061 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:27 compute-0 nova_compute[185723]: 2026-02-16 13:42:27.017 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:29 compute-0 podman[213126]: 2026-02-16 13:42:29.011423488 +0000 UTC m=+0.053886657 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-type=git, release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Feb 16 13:42:29 compute-0 podman[213127]: 2026-02-16 13:42:29.014950655 +0000 UTC m=+0.053543828 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 13:42:29 compute-0 podman[195053]: time="2026-02-16T13:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:42:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:42:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Feb 16 13:42:30 compute-0 nova_compute[185723]: 2026-02-16 13:42:30.062 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:31 compute-0 openstack_network_exporter[197909]: ERROR   13:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:42:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:42:31 compute-0 openstack_network_exporter[197909]: ERROR   13:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:42:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:42:32 compute-0 nova_compute[185723]: 2026-02-16 13:42:32.019 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:32 compute-0 nova_compute[185723]: 2026-02-16 13:42:32.890 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating tmpfile /var/lib/nova/instances/tmpnq0a57mh to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:42:33 compute-0 nova_compute[185723]: 2026-02-16 13:42:33.004 185727 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:42:33 compute-0 podman[213163]: 2026-02-16 13:42:33.029259337 +0000 UTC m=+0.069623367 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:42:34 compute-0 nova_compute[185723]: 2026-02-16 13:42:34.101 185727 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:42:34 compute-0 nova_compute[185723]: 2026-02-16 13:42:34.124 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:42:34 compute-0 nova_compute[185723]: 2026-02-16 13:42:34.124 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:42:34 compute-0 nova_compute[185723]: 2026-02-16 13:42:34.124 185727 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.063 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.594 185727 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.611 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.613 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.613 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating instance directory: /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.613 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating disk.info with the contents: {'/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk': 'qcow2', '/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.614 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.614 185727 DEBUG nova.objects.instance [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.644 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.693 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.695 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.696 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.724 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.775 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.776 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.822 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.823 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.823 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.878 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.879 185727 DEBUG nova.virt.disk.api [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.880 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.926 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.927 185727 DEBUG nova.virt.disk.api [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.928 185727 DEBUG nova.objects.instance [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.944 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.964 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.968 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config to /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:42:35 compute-0 nova_compute[185723]: 2026-02-16 13:42:35.968 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.398 185727 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.399 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.402 185727 DEBUG nova.virt.libvirt.vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:41:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:41:47Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.402 185727 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.404 185727 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.405 185727 DEBUG os_vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.406 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.407 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.407 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.410 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.411 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9816814-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.411 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9816814-5d, col_values=(('external_ids', {'iface-id': 'c9816814-5dfa-4f80-812c-4fc20a800a47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:0e:aa', 'vm-uuid': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.413 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:36 compute-0 NetworkManager[56177]: <info>  [1771249356.4143] manager: (tapc9816814-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.417 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.421 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.423 185727 INFO os_vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d')
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.423 185727 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:42:36 compute-0 nova_compute[185723]: 2026-02-16 13:42:36.423 185727 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:42:36 compute-0 sshd-session[213189]: Invalid user test from 146.190.226.24 port 48852
Feb 16 13:42:36 compute-0 sshd-session[213189]: Connection closed by invalid user test 146.190.226.24 port 48852 [preauth]
Feb 16 13:42:39 compute-0 nova_compute[185723]: 2026-02-16 13:42:39.532 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:39.531 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:42:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:39.533 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:42:40 compute-0 nova_compute[185723]: 2026-02-16 13:42:40.065 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:41 compute-0 nova_compute[185723]: 2026-02-16 13:42:41.413 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:42 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:42.535 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:42 compute-0 nova_compute[185723]: 2026-02-16 13:42:42.706 185727 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Port c9816814-5dfa-4f80-812c-4fc20a800a47 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:42:42 compute-0 nova_compute[185723]: 2026-02-16 13:42:42.708 185727 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:42:42 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:42:42 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:42:43 compute-0 kernel: tapc9816814-5d: entered promiscuous mode
Feb 16 13:42:43 compute-0 NetworkManager[56177]: <info>  [1771249363.0242] manager: (tapc9816814-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Feb 16 13:42:43 compute-0 ovn_controller[96072]: 2026-02-16T13:42:43Z|00151|binding|INFO|Claiming lport c9816814-5dfa-4f80-812c-4fc20a800a47 for this additional chassis.
Feb 16 13:42:43 compute-0 ovn_controller[96072]: 2026-02-16T13:42:43Z|00152|binding|INFO|c9816814-5dfa-4f80-812c-4fc20a800a47: Claiming fa:16:3e:b7:0e:aa 10.100.0.13
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.025 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:43 compute-0 ovn_controller[96072]: 2026-02-16T13:42:43Z|00153|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 ovn-installed in OVS
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.032 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.035 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:43 compute-0 systemd-machined[155229]: New machine qemu-14-instance-00000012.
Feb 16 13:42:43 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000012.
Feb 16 13:42:43 compute-0 systemd-udevd[213248]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:42:43 compute-0 NetworkManager[56177]: <info>  [1771249363.0894] device (tapc9816814-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:42:43 compute-0 NetworkManager[56177]: <info>  [1771249363.0905] device (tapc9816814-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:42:43 compute-0 podman[213245]: 2026-02-16 13:42:43.134718436 +0000 UTC m=+0.058829560 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.716 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249363.7157118, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.716 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Started (Lifecycle Event)
Feb 16 13:42:43 compute-0 nova_compute[185723]: 2026-02-16 13:42:43.741 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:42:44 compute-0 nova_compute[185723]: 2026-02-16 13:42:44.579 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249364.5789332, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:42:44 compute-0 nova_compute[185723]: 2026-02-16 13:42:44.580 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Resumed (Lifecycle Event)
Feb 16 13:42:44 compute-0 nova_compute[185723]: 2026-02-16 13:42:44.614 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:42:44 compute-0 nova_compute[185723]: 2026-02-16 13:42:44.618 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:42:44 compute-0 nova_compute[185723]: 2026-02-16 13:42:44.936 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:42:45 compute-0 nova_compute[185723]: 2026-02-16 13:42:45.067 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:46 compute-0 nova_compute[185723]: 2026-02-16 13:42:46.416 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:47 compute-0 ovn_controller[96072]: 2026-02-16T13:42:47Z|00154|binding|INFO|Claiming lport c9816814-5dfa-4f80-812c-4fc20a800a47 for this chassis.
Feb 16 13:42:47 compute-0 ovn_controller[96072]: 2026-02-16T13:42:47Z|00155|binding|INFO|c9816814-5dfa-4f80-812c-4fc20a800a47: Claiming fa:16:3e:b7:0e:aa 10.100.0.13
Feb 16 13:42:47 compute-0 ovn_controller[96072]: 2026-02-16T13:42:47Z|00156|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 up in Southbound
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.659 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:0e:aa 10.100.0.13'], port_security=['fa:16:3e:b7:0e:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=c9816814-5dfa-4f80-812c-4fc20a800a47) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.661 105360 INFO neutron.agent.ovn.metadata.agent [-] Port c9816814-5dfa-4f80-812c-4fc20a800a47 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.662 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.677 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a8e08cff-5fff-41f5-a6a5-2446d0cdac67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.712 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bcba3d-1232-475f-a502-618de81e2651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.716 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[495ac8e3-80ca-4b8f-b632-6ca041c3b13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.742 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[859ba34b-f30c-41cd-a5ef-3c1b1af7fc8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.761 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6f8a3a-c48b-4290-bae8-c4a06e0bbb62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537133, 'reachable_time': 44239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213319, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.779 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b12f3aff-6e66-4208-ac20-1532cf62c3de]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537143, 'tstamp': 537143}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213320, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537145, 'tstamp': 537145}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213320, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.781 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:47 compute-0 nova_compute[185723]: 2026-02-16 13:42:47.782 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.783 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.784 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.784 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:47.784 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:42:47 compute-0 nova_compute[185723]: 2026-02-16 13:42:47.896 185727 INFO nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Post operation of migration started
Feb 16 13:42:48 compute-0 nova_compute[185723]: 2026-02-16 13:42:48.567 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:42:48 compute-0 nova_compute[185723]: 2026-02-16 13:42:48.568 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:42:48 compute-0 nova_compute[185723]: 2026-02-16 13:42:48.568 185727 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.070 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.367 185727 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.404 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.423 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.424 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.424 185727 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:50 compute-0 nova_compute[185723]: 2026-02-16 13:42:50.428 185727 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:42:50 compute-0 virtqemud[184843]: Domain id=14 name='instance-00000012' uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6 is tainted: custom-monitor
Feb 16 13:42:51 compute-0 sshd-session[213321]: Invalid user admin from 64.227.72.94 port 34464
Feb 16 13:42:51 compute-0 nova_compute[185723]: 2026-02-16 13:42:51.419 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:51 compute-0 nova_compute[185723]: 2026-02-16 13:42:51.435 185727 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:42:51 compute-0 sshd-session[213321]: Connection closed by invalid user admin 64.227.72.94 port 34464 [preauth]
Feb 16 13:42:52 compute-0 nova_compute[185723]: 2026-02-16 13:42:52.440 185727 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:42:52 compute-0 nova_compute[185723]: 2026-02-16 13:42:52.445 185727 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:42:52 compute-0 nova_compute[185723]: 2026-02-16 13:42:52.469 185727 DEBUG nova.objects.instance [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:42:55 compute-0 nova_compute[185723]: 2026-02-16 13:42:55.073 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:55 compute-0 sshd-session[213323]: Invalid user deploy from 146.190.22.227 port 36402
Feb 16 13:42:55 compute-0 nova_compute[185723]: 2026-02-16 13:42:55.821 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:56 compute-0 sshd-session[213325]: Invalid user postgres from 188.166.42.159 port 51782
Feb 16 13:42:56 compute-0 sshd-session[213325]: Connection closed by invalid user postgres 188.166.42.159 port 51782 [preauth]
Feb 16 13:42:56 compute-0 nova_compute[185723]: 2026-02-16 13:42:56.421 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:56 compute-0 sshd-session[213323]: Connection closed by invalid user deploy 146.190.22.227 port 36402 [preauth]
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.019 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.020 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.020 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.021 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.021 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.023 185727 INFO nova.compute.manager [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Terminating instance
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.024 185727 DEBUG nova.compute.manager [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:42:58 compute-0 kernel: tapc9816814-5d (unregistering): left promiscuous mode
Feb 16 13:42:58 compute-0 NetworkManager[56177]: <info>  [1771249378.0452] device (tapc9816814-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:42:58 compute-0 ovn_controller[96072]: 2026-02-16T13:42:58Z|00157|binding|INFO|Releasing lport c9816814-5dfa-4f80-812c-4fc20a800a47 from this chassis (sb_readonly=0)
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.052 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 ovn_controller[96072]: 2026-02-16T13:42:58Z|00158|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 down in Southbound
Feb 16 13:42:58 compute-0 ovn_controller[96072]: 2026-02-16T13:42:58Z|00159|binding|INFO|Removing iface tapc9816814-5d ovn-installed in OVS
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.054 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.059 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.066 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:0e:aa 10.100.0.13'], port_security=['fa:16:3e:b7:0e:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=c9816814-5dfa-4f80-812c-4fc20a800a47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.068 105360 INFO neutron.agent.ovn.metadata.agent [-] Port c9816814-5dfa-4f80-812c-4fc20a800a47 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.069 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.084 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7d43dfd2-d546-4ad5-9d73-257bcf18a2bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Deactivated successfully.
Feb 16 13:42:58 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Consumed 1.449s CPU time.
Feb 16 13:42:58 compute-0 systemd-machined[155229]: Machine qemu-14-instance-00000012 terminated.
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.105 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d00511b2-f913-4049-9d3c-e05dde22288b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.109 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d33b53a2-3bdd-44cb-8190-434f8b713606]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.129 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[05603d3d-e2a5-42d1-982a-95bf8e4c63c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.145 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[77189dd3-0d59-4fbb-9c44-72cd1c9836e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537133, 'reachable_time': 44239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213339, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.158 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[be863dbc-8823-4d18-8a26-4eefc809008e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537143, 'tstamp': 537143}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213340, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537145, 'tstamp': 537145}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213340, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.160 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.161 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.165 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.166 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.166 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:42:58.166 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.287 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.314 185727 DEBUG nova.compute.manager [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.314 185727 DEBUG oslo_concurrency.lockutils [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.315 185727 DEBUG oslo_concurrency.lockutils [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.315 185727 DEBUG oslo_concurrency.lockutils [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.315 185727 DEBUG nova.compute.manager [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.315 185727 DEBUG nova.compute.manager [req-30c578a2-80d8-4742-8f93-6e8468e59539 req-9fb60c37-64d3-44fc-833f-e49112f5663e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.331 185727 INFO nova.virt.libvirt.driver [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance destroyed successfully.
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.332 185727 DEBUG nova.objects.instance [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.349 185727 DEBUG nova.virt.libvirt.vif [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:41:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:42:52Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.350 185727 DEBUG nova.network.os_vif_util [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.350 185727 DEBUG nova.network.os_vif_util [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.350 185727 DEBUG os_vif [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.352 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.352 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9816814-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.354 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.356 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.358 185727 INFO os_vif [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d')
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.358 185727 INFO nova.virt.libvirt.driver [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Deleting instance files /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6_del
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.359 185727 INFO nova.virt.libvirt.driver [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Deletion of /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6_del complete
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.422 185727 INFO nova.compute.manager [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Took 0.40 seconds to destroy the instance on the hypervisor.
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.422 185727 DEBUG oslo.service.loopingcall [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.423 185727 DEBUG nova.compute.manager [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:42:58 compute-0 nova_compute[185723]: 2026-02-16 13:42:58.423 185727 DEBUG nova.network.neutron [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.466 185727 DEBUG nova.network.neutron [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.482 185727 INFO nova.compute.manager [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Took 1.06 seconds to deallocate network for instance.
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.540 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.541 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.548 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.601 185727 INFO nova.scheduler.client.report [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance ed0f983d-6cd6-429c-8af1-0d52a56731d6
Feb 16 13:42:59 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.695 185727 DEBUG oslo_concurrency.lockutils [None req-8556d44a-c076-4022-8123-6b3cdba35a20 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:59 compute-0 podman[195053]: time="2026-02-16T13:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:42:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:42:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.995 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.997 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.997 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.997 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.997 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.998 185727 INFO nova.compute.manager [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Terminating instance
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:42:59.999 185727 DEBUG nova.compute.manager [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:43:00 compute-0 kernel: tapf4039b0f-53 (unregistering): left promiscuous mode
Feb 16 13:43:00 compute-0 podman[213359]: 2026-02-16 13:43:00.025040397 +0000 UTC m=+0.056840500 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 16 13:43:00 compute-0 NetworkManager[56177]: <info>  [1771249380.0259] device (tapf4039b0f-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:43:00 compute-0 podman[213358]: 2026-02-16 13:43:00.034558043 +0000 UTC m=+0.066743035 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.034 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_controller[96072]: 2026-02-16T13:43:00Z|00160|binding|INFO|Releasing lport f4039b0f-5331-420f-9e9f-432d4b817a98 from this chassis (sb_readonly=0)
Feb 16 13:43:00 compute-0 ovn_controller[96072]: 2026-02-16T13:43:00Z|00161|binding|INFO|Setting lport f4039b0f-5331-420f-9e9f-432d4b817a98 down in Southbound
Feb 16 13:43:00 compute-0 ovn_controller[96072]: 2026-02-16T13:43:00Z|00162|binding|INFO|Removing iface tapf4039b0f-53 ovn-installed in OVS
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.036 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.042 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:de:ed 10.100.0.4'], port_security=['fa:16:3e:c5:de:ed 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '44f63a81-024b-446b-a144-28445aaae47c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=f4039b0f-5331-420f-9e9f-432d4b817a98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.043 105360 INFO neutron.agent.ovn.metadata.agent [-] Port f4039b0f-5331-420f-9e9f-432d4b817a98 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.044 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.045 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.045 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[106aeecd-a75c-473a-bdcd-eae846018022]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.047 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.074 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Feb 16 13:43:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.986s CPU time.
Feb 16 13:43:00 compute-0 systemd-machined[155229]: Machine qemu-13-instance-00000011 terminated.
Feb 16 13:43:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [NOTICE]   (212940) : haproxy version is 2.8.14-c23fe91
Feb 16 13:43:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [NOTICE]   (212940) : path to executable is /usr/sbin/haproxy
Feb 16 13:43:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [WARNING]  (212940) : Exiting Master process...
Feb 16 13:43:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [ALERT]    (212940) : Current worker (212942) exited with code 143 (Terminated)
Feb 16 13:43:00 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212936]: [WARNING]  (212940) : All workers exited. Exiting... (0)
Feb 16 13:43:00 compute-0 systemd[1]: libpod-42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f.scope: Deactivated successfully.
Feb 16 13:43:00 compute-0 podman[213419]: 2026-02-16 13:43:00.170745319 +0000 UTC m=+0.044483064 container died 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f-userdata-shm.mount: Deactivated successfully.
Feb 16 13:43:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-035db9f9b501e1ba06ecf122331369d532122cbed242aa804dee3180072ddefe-merged.mount: Deactivated successfully.
Feb 16 13:43:00 compute-0 podman[213419]: 2026-02-16 13:43:00.205143551 +0000 UTC m=+0.078881266 container cleanup 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:43:00 compute-0 systemd[1]: libpod-conmon-42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f.scope: Deactivated successfully.
Feb 16 13:43:00 compute-0 NetworkManager[56177]: <info>  [1771249380.2165] manager: (tapf4039b0f-53): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.237 185727 INFO nova.virt.libvirt.driver [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Instance destroyed successfully.
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.238 185727 DEBUG nova.objects.instance [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 44f63a81-024b-446b-a144-28445aaae47c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:43:00 compute-0 podman[213452]: 2026-02-16 13:43:00.265446756 +0000 UTC m=+0.043331575 container remove 42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.265 185727 DEBUG nova.virt.libvirt.vif [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:41:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-230655652',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-230655652',id=17,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:41:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4tu80fu6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name=
'tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:41:28Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=44f63a81-024b-446b-a144-28445aaae47c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.266 185727 DEBUG nova.network.os_vif_util [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "f4039b0f-5331-420f-9e9f-432d4b817a98", "address": "fa:16:3e:c5:de:ed", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4039b0f-53", "ovs_interfaceid": "f4039b0f-5331-420f-9e9f-432d4b817a98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.267 185727 DEBUG nova.network.os_vif_util [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.267 185727 DEBUG os_vif [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.269 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.270 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4039b0f-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.270 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[72b70868-643e-4e6e-ad38-27b80e1745e4]: (4, ('Mon Feb 16 01:43:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f)\n42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f\nMon Feb 16 01:43:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f)\n42069fc2ab7b509976930f3d5acfa8f54a7dd91cf46332176957e1c728be8c8f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.271 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.272 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc3d8b3-1699-456d-bc03-04412f4cb39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.273 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.273 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.274 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.278 185727 INFO os_vif [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:de:ed,bridge_name='br-int',has_traffic_filtering=True,id=f4039b0f-5331-420f-9e9f-432d4b817a98,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4039b0f-53')
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.279 185727 INFO nova.virt.libvirt.driver [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Deleting instance files /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c_del
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.279 185727 INFO nova.virt.libvirt.driver [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Deletion of /var/lib/nova/instances/44f63a81-024b-446b-a144-28445aaae47c_del complete
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.281 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[177020e5-c404-4dec-9e33-7793525c2bdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.283 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.293 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b0976374-ee03-45b7-88e4-7f20ef087ba0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.295 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4b283426-b26b-4280-bca9-fa630508ac50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.308 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[16c3e06b-ad08-4c8b-a08c-97fbb06cfa10]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537127, 'reachable_time': 36179, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213483, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.312 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:43:00 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:00.313 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[ac7a8f04-6774-4f53-9977-79c47961ba63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.325 185727 INFO nova.compute.manager [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.326 185727 DEBUG oslo.service.loopingcall [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.326 185727 DEBUG nova.compute.manager [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.326 185727 DEBUG nova.network.neutron [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.416 185727 DEBUG nova.compute.manager [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.417 185727 DEBUG oslo_concurrency.lockutils [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.417 185727 DEBUG oslo_concurrency.lockutils [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.418 185727 DEBUG oslo_concurrency.lockutils [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.418 185727 DEBUG nova.compute.manager [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.418 185727 WARNING nova.compute.manager [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state deleted and task_state None.
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.418 185727 DEBUG nova.compute.manager [req-c5d90dd3-91ee-4dd5-874a-9dd68b6326ac req-f797b71e-8684-4321-950c-f881214b2ca5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-deleted-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.889 185727 DEBUG nova.compute.manager [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-unplugged-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.890 185727 DEBUG oslo_concurrency.lockutils [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.890 185727 DEBUG oslo_concurrency.lockutils [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.890 185727 DEBUG oslo_concurrency.lockutils [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.891 185727 DEBUG nova.compute.manager [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] No waiting events found dispatching network-vif-unplugged-f4039b0f-5331-420f-9e9f-432d4b817a98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:43:00 compute-0 nova_compute[185723]: 2026-02-16 13:43:00.891 185727 DEBUG nova.compute.manager [req-8840afe4-58d3-4f29-b17a-574d15da4244 req-d5baca2a-9143-46fc-a95a-bc7b62ca5a40 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-unplugged-f4039b0f-5331-420f-9e9f-432d4b817a98 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.215 185727 DEBUG nova.network.neutron [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.233 185727 INFO nova.compute.manager [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Took 0.91 seconds to deallocate network for instance.
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.272 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.273 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.331 185727 DEBUG nova.compute.provider_tree [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.349 185727 DEBUG nova.scheduler.client.report [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.375 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:01 compute-0 openstack_network_exporter[197909]: ERROR   13:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:43:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:43:01 compute-0 openstack_network_exporter[197909]: ERROR   13:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:43:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.427 185727 INFO nova.scheduler.client.report [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 44f63a81-024b-446b-a144-28445aaae47c
Feb 16 13:43:01 compute-0 nova_compute[185723]: 2026-02-16 13:43:01.511 185727 DEBUG oslo_concurrency.lockutils [None req-8b17c3c6-8fca-4e8c-b7c5-ba73474e087a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.974 185727 DEBUG nova.compute.manager [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.974 185727 DEBUG oslo_concurrency.lockutils [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "44f63a81-024b-446b-a144-28445aaae47c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.974 185727 DEBUG oslo_concurrency.lockutils [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.974 185727 DEBUG oslo_concurrency.lockutils [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "44f63a81-024b-446b-a144-28445aaae47c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.975 185727 DEBUG nova.compute.manager [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] No waiting events found dispatching network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.975 185727 WARNING nova.compute.manager [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received unexpected event network-vif-plugged-f4039b0f-5331-420f-9e9f-432d4b817a98 for instance with vm_state deleted and task_state None.
Feb 16 13:43:02 compute-0 nova_compute[185723]: 2026-02-16 13:43:02.975 185727 DEBUG nova.compute.manager [req-d48f7e1b-be4b-4cc6-96c0-f6bccfeafdde req-6bd438d4-2bc8-49f2-96bc-60274a06819c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Received event network-vif-deleted-f4039b0f-5331-420f-9e9f-432d4b817a98 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:03.236 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:03.238 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:03.238 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:04 compute-0 podman[213484]: 2026-02-16 13:43:04.031250047 +0000 UTC m=+0.068305545 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:43:05 compute-0 nova_compute[185723]: 2026-02-16 13:43:05.076 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:05 compute-0 nova_compute[185723]: 2026-02-16 13:43:05.271 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:09 compute-0 nova_compute[185723]: 2026-02-16 13:43:09.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:09 compute-0 nova_compute[185723]: 2026-02-16 13:43:09.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:43:09 compute-0 nova_compute[185723]: 2026-02-16 13:43:09.455 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:43:10 compute-0 nova_compute[185723]: 2026-02-16 13:43:10.078 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:10 compute-0 nova_compute[185723]: 2026-02-16 13:43:10.274 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:13 compute-0 nova_compute[185723]: 2026-02-16 13:43:13.329 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249378.3280087, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:13 compute-0 nova_compute[185723]: 2026-02-16 13:43:13.330 185727 INFO nova.compute.manager [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Stopped (Lifecycle Event)
Feb 16 13:43:13 compute-0 nova_compute[185723]: 2026-02-16 13:43:13.349 185727 DEBUG nova.compute.manager [None req-b795b09e-7169-4561-9894-bfae15bdd931 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:14 compute-0 podman[213511]: 2026-02-16 13:43:14.000964761 +0000 UTC m=+0.040056982 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:43:15 compute-0 nova_compute[185723]: 2026-02-16 13:43:15.081 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:15 compute-0 nova_compute[185723]: 2026-02-16 13:43:15.236 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249380.235596, 44f63a81-024b-446b-a144-28445aaae47c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:15 compute-0 nova_compute[185723]: 2026-02-16 13:43:15.236 185727 INFO nova.compute.manager [-] [instance: 44f63a81-024b-446b-a144-28445aaae47c] VM Stopped (Lifecycle Event)
Feb 16 13:43:15 compute-0 nova_compute[185723]: 2026-02-16 13:43:15.269 185727 DEBUG nova.compute.manager [None req-a9e0d09c-5d65-4dbb-9e96-a1f708370134 - - - - - -] [instance: 44f63a81-024b-446b-a144-28445aaae47c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:15 compute-0 nova_compute[185723]: 2026-02-16 13:43:15.275 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:17 compute-0 nova_compute[185723]: 2026-02-16 13:43:17.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:19 compute-0 nova_compute[185723]: 2026-02-16 13:43:19.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.083 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.277 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.450 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.450 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.476 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.477 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.477 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.477 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.608 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.609 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5823MB free_disk=73.22512817382812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.609 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.609 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.739 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.739 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.796 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.824 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.849 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:43:20 compute-0 nova_compute[185723]: 2026-02-16 13:43:20.850 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:21 compute-0 nova_compute[185723]: 2026-02-16 13:43:21.833 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:21 compute-0 nova_compute[185723]: 2026-02-16 13:43:21.833 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:22 compute-0 nova_compute[185723]: 2026-02-16 13:43:22.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:23 compute-0 nova_compute[185723]: 2026-02-16 13:43:23.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:23 compute-0 nova_compute[185723]: 2026-02-16 13:43:23.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:23 compute-0 nova_compute[185723]: 2026-02-16 13:43:23.454 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:43:24 compute-0 nova_compute[185723]: 2026-02-16 13:43:24.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:25 compute-0 nova_compute[185723]: 2026-02-16 13:43:25.084 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:25 compute-0 nova_compute[185723]: 2026-02-16 13:43:25.279 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-0 nova_compute[185723]: 2026-02-16 13:43:27.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:27 compute-0 nova_compute[185723]: 2026-02-16 13:43:27.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:43:29 compute-0 podman[195053]: time="2026-02-16T13:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:43:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:43:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 13:43:30 compute-0 nova_compute[185723]: 2026-02-16 13:43:30.085 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:30 compute-0 nova_compute[185723]: 2026-02-16 13:43:30.280 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:31 compute-0 podman[213538]: 2026-02-16 13:43:31.011623167 +0000 UTC m=+0.044535325 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:43:31 compute-0 podman[213537]: 2026-02-16 13:43:31.01291974 +0000 UTC m=+0.049515199 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.7, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:43:31 compute-0 openstack_network_exporter[197909]: ERROR   13:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:43:31 compute-0 openstack_network_exporter[197909]: ERROR   13:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:43:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:43:35 compute-0 podman[213577]: 2026-02-16 13:43:35.031100807 +0000 UTC m=+0.072879688 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:43:35 compute-0 nova_compute[185723]: 2026-02-16 13:43:35.087 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:35 compute-0 nova_compute[185723]: 2026-02-16 13:43:35.282 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:35 compute-0 nova_compute[185723]: 2026-02-16 13:43:35.939 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:35 compute-0 nova_compute[185723]: 2026-02-16 13:43:35.939 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:35 compute-0 nova_compute[185723]: 2026-02-16 13:43:35.957 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.105 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.106 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.111 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.112 185727 INFO nova.compute.claims [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.223 185727 DEBUG nova.compute.provider_tree [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.237 185727 DEBUG nova.scheduler.client.report [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.258 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.259 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.300 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.300 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.324 185727 INFO nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.346 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.451 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.453 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.454 185727 INFO nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Creating image(s)
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.455 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.456 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.457 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.482 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.536 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.537 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.538 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.550 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.605 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.606 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.643 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.644 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.645 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.696 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.697 185727 DEBUG nova.virt.disk.api [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.698 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.748 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.749 185727 DEBUG nova.virt.disk.api [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.749 185727 DEBUG nova.objects.instance [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid e5e9f487-f690-47b2-aaed-59236907f08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.766 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.766 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Ensure instance console log exists: /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.767 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.767 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.767 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.813 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.835 185727 WARNING nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.835 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Triggering sync for uuid e5e9f487-f690-47b2-aaed-59236907f08b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:43:36 compute-0 nova_compute[185723]: 2026-02-16 13:43:36.836 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:37 compute-0 nova_compute[185723]: 2026-02-16 13:43:37.420 185727 DEBUG nova.policy [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:43:38 compute-0 sshd-session[213620]: Invalid user admin from 64.227.72.94 port 37668
Feb 16 13:43:38 compute-0 sshd-session[213620]: Connection closed by invalid user admin 64.227.72.94 port 37668 [preauth]
Feb 16 13:43:39 compute-0 nova_compute[185723]: 2026-02-16 13:43:39.126 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Successfully created port: abc17ab2-80ac-4b01-990a-6eb58e6238ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:43:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:39.618 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:43:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:39.620 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:43:39 compute-0 nova_compute[185723]: 2026-02-16 13:43:39.619 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:39 compute-0 nova_compute[185723]: 2026-02-16 13:43:39.741 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:40 compute-0 nova_compute[185723]: 2026-02-16 13:43:40.146 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:40 compute-0 nova_compute[185723]: 2026-02-16 13:43:40.283 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:41 compute-0 sshd-session[213622]: Invalid user test from 146.190.226.24 port 50440
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.715 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Successfully updated port: abc17ab2-80ac-4b01-990a-6eb58e6238ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.732 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.733 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.733 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.834 185727 DEBUG nova.compute.manager [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-changed-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.834 185727 DEBUG nova.compute.manager [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Refreshing instance network info cache due to event network-changed-abc17ab2-80ac-4b01-990a-6eb58e6238ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:43:41 compute-0 nova_compute[185723]: 2026-02-16 13:43:41.835 185727 DEBUG oslo_concurrency.lockutils [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:43:41 compute-0 sshd-session[213622]: Connection closed by invalid user test 146.190.226.24 port 50440 [preauth]
Feb 16 13:43:42 compute-0 nova_compute[185723]: 2026-02-16 13:43:42.587 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.785 185727 DEBUG nova.network.neutron [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updating instance_info_cache with network_info: [{"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.803 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.803 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Instance network_info: |[{"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.804 185727 DEBUG oslo_concurrency.lockutils [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.804 185727 DEBUG nova.network.neutron [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Refreshing network info cache for port abc17ab2-80ac-4b01-990a-6eb58e6238ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.808 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Start _get_guest_xml network_info=[{"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.812 185727 WARNING nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.817 185727 DEBUG nova.virt.libvirt.host [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.818 185727 DEBUG nova.virt.libvirt.host [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.827 185727 DEBUG nova.virt.libvirt.host [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.828 185727 DEBUG nova.virt.libvirt.host [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.829 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.829 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.830 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.830 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.830 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.831 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.831 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.831 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.831 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.832 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.832 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.832 185727 DEBUG nova.virt.hardware [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.836 185727 DEBUG nova.virt.libvirt.vif [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2076948568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2076948568',id=20,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-1ickj6a9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:43:36Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=e5e9f487-f690-47b2-aaed-59236907f08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.837 185727 DEBUG nova.network.os_vif_util [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.837 185727 DEBUG nova.network.os_vif_util [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.838 185727 DEBUG nova.objects.instance [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5e9f487-f690-47b2-aaed-59236907f08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.861 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <uuid>e5e9f487-f690-47b2-aaed-59236907f08b</uuid>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <name>instance-00000014</name>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-2076948568</nova:name>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:43:44</nova:creationTime>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         <nova:port uuid="abc17ab2-80ac-4b01-990a-6eb58e6238ba">
Feb 16 13:43:44 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <system>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="serial">e5e9f487-f690-47b2-aaed-59236907f08b</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="uuid">e5e9f487-f690-47b2-aaed-59236907f08b</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </system>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <os>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </os>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <features>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </features>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.config"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:eb:ae:a5"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <target dev="tapabc17ab2-80"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/console.log" append="off"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <video>
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </video>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:43:44 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:43:44 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:43:44 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:43:44 compute-0 nova_compute[185723]: </domain>
Feb 16 13:43:44 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.863 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Preparing to wait for external event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.863 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.863 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.864 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.865 185727 DEBUG nova.virt.libvirt.vif [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2076948568',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2076948568',id=20,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-1ickj6a9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:43:36Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=e5e9f487-f690-47b2-aaed-59236907f08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.865 185727 DEBUG nova.network.os_vif_util [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.866 185727 DEBUG nova.network.os_vif_util [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.867 185727 DEBUG os_vif [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.869 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.869 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.870 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.874 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapabc17ab2-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.875 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapabc17ab2-80, col_values=(('external_ids', {'iface-id': 'abc17ab2-80ac-4b01-990a-6eb58e6238ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:ae:a5', 'vm-uuid': 'e5e9f487-f690-47b2-aaed-59236907f08b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.876 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-0 NetworkManager[56177]: <info>  [1771249424.8784] manager: (tapabc17ab2-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.879 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.882 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.884 185727 INFO os_vif [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80')
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.934 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.935 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.935 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:eb:ae:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:43:44 compute-0 nova_compute[185723]: 2026-02-16 13:43:44.935 185727 INFO nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Using config drive
Feb 16 13:43:45 compute-0 podman[213626]: 2026-02-16 13:43:45.023082373 +0000 UTC m=+0.056713737 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.147 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:45 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:45.622 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.782 185727 INFO nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Creating config drive at /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.config
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.786 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy33gsdyx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.904 185727 DEBUG oslo_concurrency.processutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy33gsdyx" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:45 compute-0 kernel: tapabc17ab2-80: entered promiscuous mode
Feb 16 13:43:45 compute-0 NetworkManager[56177]: <info>  [1771249425.9655] manager: (tapabc17ab2-80): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Feb 16 13:43:45 compute-0 ovn_controller[96072]: 2026-02-16T13:43:45Z|00163|binding|INFO|Claiming lport abc17ab2-80ac-4b01-990a-6eb58e6238ba for this chassis.
Feb 16 13:43:45 compute-0 ovn_controller[96072]: 2026-02-16T13:43:45Z|00164|binding|INFO|abc17ab2-80ac-4b01-990a-6eb58e6238ba: Claiming fa:16:3e:eb:ae:a5 10.100.0.14
Feb 16 13:43:45 compute-0 systemd-udevd[213665]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.991 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:45 compute-0 nova_compute[185723]: 2026-02-16 13:43:45.994 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 NetworkManager[56177]: <info>  [1771249426.0038] device (tapabc17ab2-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:43:46 compute-0 NetworkManager[56177]: <info>  [1771249426.0051] device (tapabc17ab2-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.003 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:ae:a5 10.100.0.14'], port_security=['fa:16:3e:eb:ae:a5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e5e9f487-f690-47b2-aaed-59236907f08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=abc17ab2-80ac-4b01-990a-6eb58e6238ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.004 105360 INFO neutron.agent.ovn.metadata.agent [-] Port abc17ab2-80ac-4b01-990a-6eb58e6238ba in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.005 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:46 compute-0 ovn_controller[96072]: 2026-02-16T13:43:46Z|00165|binding|INFO|Setting lport abc17ab2-80ac-4b01-990a-6eb58e6238ba ovn-installed in OVS
Feb 16 13:43:46 compute-0 ovn_controller[96072]: 2026-02-16T13:43:46Z|00166|binding|INFO|Setting lport abc17ab2-80ac-4b01-990a-6eb58e6238ba up in Southbound
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.010 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.016 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[572880ef-ee42-4330-ac97-43b566eab699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.017 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:43:46 compute-0 systemd-machined[155229]: New machine qemu-15-instance-00000014.
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.019 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.019 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[93f02a11-7cfc-4b6e-b2a4-82dc67182604]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.020 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d58a76f7-dff8-4500-b48b-06f1887eda05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000014.
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.033 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e63f00d8-2b2a-4dd0-b123-00793dd5000e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.044 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cbaa4278-2db5-48c4-9e6a-a0a368cdff17]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.060 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9ccfd9-9ed2-4d0e-b4c9-68e7ee75d798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.066 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[34e45bca-5771-45e8-b6a9-3dc16f184b7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 NetworkManager[56177]: <info>  [1771249426.0675] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Feb 16 13:43:46 compute-0 systemd-udevd[213669]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.093 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[61c03b28-c25e-4307-8afe-99d9bf46d545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.097 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[36206f48-0a05-4f47-8295-07220ff06f7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 NetworkManager[56177]: <info>  [1771249426.1149] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.117 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[23eb2dd1-6a5c-4b6a-ad50-ab840606ef61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.131 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ff057241-5a78-466d-b9d3-f2ada11f1046]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550951, 'reachable_time': 24640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213701, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.141 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[311143bd-09a3-450a-a793-bbe80cae5e2f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550951, 'tstamp': 550951}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213702, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.156 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fab52bba-6cdd-4979-a2c1-0c5ecdbdf5dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550951, 'reachable_time': 24640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213703, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.181 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbf0d21-bae8-47bf-9309-810cc11f8095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.229 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1571f9bd-c77f-48b5-91af-1239fbe9aeb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.231 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.231 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.232 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.233 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:43:46 compute-0 NetworkManager[56177]: <info>  [1771249426.2343] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.235 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.238 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:46 compute-0 ovn_controller[96072]: 2026-02-16T13:43:46Z|00167|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.239 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.240 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.240 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.241 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5b830471-e972-4971-b1be-0725705cbc2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.242 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:43:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:43:46.243 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.243 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.262 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249426.262146, e5e9f487-f690-47b2-aaed-59236907f08b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.263 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] VM Started (Lifecycle Event)
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.283 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.287 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249426.262476, e5e9f487-f690-47b2-aaed-59236907f08b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.287 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] VM Paused (Lifecycle Event)
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.316 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.321 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.345 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:43:46 compute-0 podman[213740]: 2026-02-16 13:43:46.59843285 +0000 UTC m=+0.081492411 container create 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:43:46 compute-0 podman[213740]: 2026-02-16 13:43:46.537289244 +0000 UTC m=+0.020348825 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:43:46 compute-0 systemd[1]: Started libpod-conmon-4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971.scope.
Feb 16 13:43:46 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:43:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94becb21cc7476a7bacb97d85f00285439de3e453447abec59d7416e107ac9bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:43:46 compute-0 podman[213740]: 2026-02-16 13:43:46.680164096 +0000 UTC m=+0.163223677 container init 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 16 13:43:46 compute-0 podman[213740]: 2026-02-16 13:43:46.685098238 +0000 UTC m=+0.168157809 container start 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:43:46 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [NOTICE]   (213760) : New worker (213762) forked
Feb 16 13:43:46 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [NOTICE]   (213760) : Loading success.
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.814 185727 DEBUG nova.compute.manager [req-336fb8ec-f4f2-41d2-9ec5-d5735b050963 req-078edb6b-9407-4e6b-89c3-37f5c3361e0c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.815 185727 DEBUG oslo_concurrency.lockutils [req-336fb8ec-f4f2-41d2-9ec5-d5735b050963 req-078edb6b-9407-4e6b-89c3-37f5c3361e0c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.815 185727 DEBUG oslo_concurrency.lockutils [req-336fb8ec-f4f2-41d2-9ec5-d5735b050963 req-078edb6b-9407-4e6b-89c3-37f5c3361e0c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.816 185727 DEBUG oslo_concurrency.lockutils [req-336fb8ec-f4f2-41d2-9ec5-d5735b050963 req-078edb6b-9407-4e6b-89c3-37f5c3361e0c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.816 185727 DEBUG nova.compute.manager [req-336fb8ec-f4f2-41d2-9ec5-d5735b050963 req-078edb6b-9407-4e6b-89c3-37f5c3361e0c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Processing event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.817 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.822 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249426.8217585, e5e9f487-f690-47b2-aaed-59236907f08b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.822 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] VM Resumed (Lifecycle Event)
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.824 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.828 185727 INFO nova.virt.libvirt.driver [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Instance spawned successfully.
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.829 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.847 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.854 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.858 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.859 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.859 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.860 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.860 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.861 185727 DEBUG nova.virt.libvirt.driver [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.897 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.941 185727 INFO nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Took 10.49 seconds to spawn the instance on the hypervisor.
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.942 185727 DEBUG nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.950 185727 DEBUG nova.network.neutron [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updated VIF entry in instance network info cache for port abc17ab2-80ac-4b01-990a-6eb58e6238ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.950 185727 DEBUG nova.network.neutron [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updating instance_info_cache with network_info: [{"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:43:46 compute-0 nova_compute[185723]: 2026-02-16 13:43:46.984 185727 DEBUG oslo_concurrency.lockutils [req-84a70273-a5b1-410f-9700-a58fc544d446 req-976f987b-4393-4382-9cd3-a669784da162 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:43:47 compute-0 nova_compute[185723]: 2026-02-16 13:43:47.034 185727 INFO nova.compute.manager [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Took 11.02 seconds to build instance.
Feb 16 13:43:47 compute-0 nova_compute[185723]: 2026-02-16 13:43:47.059 185727 DEBUG oslo_concurrency.lockutils [None req-5ebbdce6-00be-45c3-bcf7-9f97301139cd e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:47 compute-0 nova_compute[185723]: 2026-02-16 13:43:47.059 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:47 compute-0 nova_compute[185723]: 2026-02-16 13:43:47.059 185727 INFO nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:43:47 compute-0 nova_compute[185723]: 2026-02-16 13:43:47.060 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:48 compute-0 sshd-session[213771]: Invalid user postgres from 188.166.42.159 port 38186
Feb 16 13:43:48 compute-0 sshd-session[213771]: Connection closed by invalid user postgres 188.166.42.159 port 38186 [preauth]
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.908 185727 DEBUG nova.compute.manager [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.909 185727 DEBUG oslo_concurrency.lockutils [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.909 185727 DEBUG oslo_concurrency.lockutils [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.909 185727 DEBUG oslo_concurrency.lockutils [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.909 185727 DEBUG nova.compute.manager [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] No waiting events found dispatching network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:43:48 compute-0 nova_compute[185723]: 2026-02-16 13:43:48.910 185727 WARNING nova.compute.manager [req-632ef195-0939-4baf-b6a4-593c7f9ee9e6 req-1b77815d-6d2a-4b27-a36e-ff515efb9548 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received unexpected event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba for instance with vm_state active and task_state None.
Feb 16 13:43:49 compute-0 nova_compute[185723]: 2026-02-16 13:43:49.877 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:50 compute-0 nova_compute[185723]: 2026-02-16 13:43:50.149 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:54 compute-0 nova_compute[185723]: 2026-02-16 13:43:54.879 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:55 compute-0 nova_compute[185723]: 2026-02-16 13:43:55.152 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:58 compute-0 ovn_controller[96072]: 2026-02-16T13:43:58Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:ae:a5 10.100.0.14
Feb 16 13:43:58 compute-0 ovn_controller[96072]: 2026-02-16T13:43:58Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:ae:a5 10.100.0.14
Feb 16 13:43:59 compute-0 podman[195053]: time="2026-02-16T13:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:43:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:43:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:43:59 compute-0 nova_compute[185723]: 2026-02-16 13:43:59.881 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:00 compute-0 nova_compute[185723]: 2026-02-16 13:44:00.153 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:01 compute-0 openstack_network_exporter[197909]: ERROR   13:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:44:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:44:01 compute-0 openstack_network_exporter[197909]: ERROR   13:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:44:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:44:02 compute-0 podman[213786]: 2026-02-16 13:44:02.01363673 +0000 UTC m=+0.046644018 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Feb 16 13:44:02 compute-0 podman[213785]: 2026-02-16 13:44:02.014124512 +0000 UTC m=+0.046407082 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 13:44:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:03.237 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:03.238 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:03.239 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:04 compute-0 nova_compute[185723]: 2026-02-16 13:44:04.884 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:05 compute-0 nova_compute[185723]: 2026-02-16 13:44:05.156 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:06 compute-0 podman[213823]: 2026-02-16 13:44:06.034562475 +0000 UTC m=+0.076245921 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 16 13:44:09 compute-0 nova_compute[185723]: 2026-02-16 13:44:09.887 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:10 compute-0 nova_compute[185723]: 2026-02-16 13:44:10.205 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:14 compute-0 nova_compute[185723]: 2026-02-16 13:44:14.888 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:15 compute-0 nova_compute[185723]: 2026-02-16 13:44:15.208 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:16 compute-0 ovn_controller[96072]: 2026-02-16T13:44:16Z|00168|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Feb 16 13:44:16 compute-0 podman[213850]: 2026-02-16 13:44:16.013503327 +0000 UTC m=+0.051685682 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:44:19 compute-0 nova_compute[185723]: 2026-02-16 13:44:19.450 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:19 compute-0 nova_compute[185723]: 2026-02-16 13:44:19.450 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:19 compute-0 nova_compute[185723]: 2026-02-16 13:44:19.890 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.210 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.468 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.468 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.469 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.469 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.545 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.607 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.607 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.685 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.824 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.825 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5654MB free_disk=73.19636154174805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.826 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:20 compute-0 nova_compute[185723]: 2026-02-16 13:44:20.826 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.653 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance e5e9f487-f690-47b2-aaed-59236907f08b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.653 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.654 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.767 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.793 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.918 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:44:22 compute-0 nova_compute[185723]: 2026-02-16 13:44:22.918 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:24 compute-0 nova_compute[185723]: 2026-02-16 13:44:24.892 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:24 compute-0 nova_compute[185723]: 2026-02-16 13:44:24.918 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:24 compute-0 nova_compute[185723]: 2026-02-16 13:44:24.919 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:24 compute-0 nova_compute[185723]: 2026-02-16 13:44:24.919 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:44:24 compute-0 nova_compute[185723]: 2026-02-16 13:44:24.919 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:44:24 compute-0 sshd-session[213883]: Invalid user test from 146.190.22.227 port 59006
Feb 16 13:44:25 compute-0 nova_compute[185723]: 2026-02-16 13:44:25.212 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:25 compute-0 sshd-session[213883]: Connection closed by invalid user test 146.190.22.227 port 59006 [preauth]
Feb 16 13:44:25 compute-0 sshd-session[213885]: Invalid user admin from 64.227.72.94 port 59204
Feb 16 13:44:25 compute-0 sshd-session[213885]: Connection closed by invalid user admin 64.227.72.94 port 59204 [preauth]
Feb 16 13:44:25 compute-0 nova_compute[185723]: 2026-02-16 13:44:25.632 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:44:25 compute-0 nova_compute[185723]: 2026-02-16 13:44:25.632 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:44:25 compute-0 nova_compute[185723]: 2026-02-16 13:44:25.632 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:44:25 compute-0 nova_compute[185723]: 2026-02-16 13:44:25.632 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e5e9f487-f690-47b2-aaed-59236907f08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:44:29 compute-0 podman[195053]: time="2026-02-16T13:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:44:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:44:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.770 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updating instance_info_cache with network_info: [{"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.820 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-e5e9f487-f690-47b2-aaed-59236907f08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.820 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.821 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.821 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.821 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.821 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.822 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:44:29 compute-0 nova_compute[185723]: 2026-02-16 13:44:29.893 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:30 compute-0 nova_compute[185723]: 2026-02-16 13:44:30.214 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:31 compute-0 openstack_network_exporter[197909]: ERROR   13:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:44:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:44:31 compute-0 openstack_network_exporter[197909]: ERROR   13:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:44:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:44:33 compute-0 podman[213888]: 2026-02-16 13:44:33.011376654 +0000 UTC m=+0.045515960 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:44:33 compute-0 podman[213887]: 2026-02-16 13:44:33.011369844 +0000 UTC m=+0.050599176 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container)
Feb 16 13:44:34 compute-0 nova_compute[185723]: 2026-02-16 13:44:34.894 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:35 compute-0 nova_compute[185723]: 2026-02-16 13:44:35.217 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:37 compute-0 podman[213928]: 2026-02-16 13:44:37.023619214 +0000 UTC m=+0.064658284 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 13:44:39 compute-0 sshd-session[213954]: Invalid user postgres from 188.166.42.159 port 50446
Feb 16 13:44:39 compute-0 sshd-session[213954]: Connection closed by invalid user postgres 188.166.42.159 port 50446 [preauth]
Feb 16 13:44:39 compute-0 nova_compute[185723]: 2026-02-16 13:44:39.896 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:40 compute-0 nova_compute[185723]: 2026-02-16 13:44:40.218 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:41 compute-0 nova_compute[185723]: 2026-02-16 13:44:41.703 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating tmpfile /var/lib/nova/instances/tmpeh4lji_d to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:44:41 compute-0 nova_compute[185723]: 2026-02-16 13:44:41.704 185727 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:44:42 compute-0 nova_compute[185723]: 2026-02-16 13:44:42.722 185727 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:44:42 compute-0 nova_compute[185723]: 2026-02-16 13:44:42.759 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:44:42 compute-0 nova_compute[185723]: 2026-02-16 13:44:42.759 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:44:42 compute-0 nova_compute[185723]: 2026-02-16 13:44:42.760 185727 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:44:44 compute-0 nova_compute[185723]: 2026-02-16 13:44:44.897 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:44 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:44:45 compute-0 nova_compute[185723]: 2026-02-16 13:44:45.219 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:45 compute-0 nova_compute[185723]: 2026-02-16 13:44:45.997 185727 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:44:47 compute-0 podman[213965]: 2026-02-16 13:44:47.026207525 +0000 UTC m=+0.069494544 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.509 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.511 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.511 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating instance directory: /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.512 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating disk.info with the contents: {'/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk': 'qcow2', '/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.512 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.512 185727 DEBUG nova.objects.instance [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.540 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.609 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.611 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.611 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.628 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.691 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.692 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.721 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.723 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.723 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.771 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.773 185727 DEBUG nova.virt.disk.api [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.773 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.825 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.827 185727 DEBUG nova.virt.disk.api [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.827 185727 DEBUG nova.objects.instance [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.846 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.866 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.867 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config to /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:44:47 compute-0 nova_compute[185723]: 2026-02-16 13:44:47.868 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.305 185727 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.306 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.307 185727 DEBUG nova.virt.libvirt.vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:43:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:43:28Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.307 185727 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.308 185727 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.309 185727 DEBUG os_vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.310 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.311 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.311 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.313 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.314 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5736eee-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.314 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5736eee-a7, col_values=(('external_ids', {'iface-id': 'b5736eee-a7c7-4376-87f8-2ba8e852813f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:03:04', 'vm-uuid': '5a1cf877-f781-4088-8f98-19d39a95d5bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.315 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:48 compute-0 NetworkManager[56177]: <info>  [1771249488.3169] manager: (tapb5736eee-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.318 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.321 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.322 185727 INFO os_vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7')
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.322 185727 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:44:48 compute-0 nova_compute[185723]: 2026-02-16 13:44:48.322 185727 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:44:49 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:49.311 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:44:49 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:49.312 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:44:49 compute-0 nova_compute[185723]: 2026-02-16 13:44:49.313 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-0 sshd-session[214010]: Invalid user test from 146.190.226.24 port 38042
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.105 185727 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Port b5736eee-a7c7-4376-87f8-2ba8e852813f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.107 185727 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.222 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:44:50 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:44:50 compute-0 sshd-session[214010]: Connection closed by invalid user test 146.190.226.24 port 38042 [preauth]
Feb 16 13:44:50 compute-0 kernel: tapb5736eee-a7: entered promiscuous mode
Feb 16 13:44:50 compute-0 NetworkManager[56177]: <info>  [1771249490.4768] manager: (tapb5736eee-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Feb 16 13:44:50 compute-0 ovn_controller[96072]: 2026-02-16T13:44:50Z|00169|binding|INFO|Claiming lport b5736eee-a7c7-4376-87f8-2ba8e852813f for this additional chassis.
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.477 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-0 ovn_controller[96072]: 2026-02-16T13:44:50Z|00170|binding|INFO|b5736eee-a7c7-4376-87f8-2ba8e852813f: Claiming fa:16:3e:e4:03:04 10.100.0.11
Feb 16 13:44:50 compute-0 ovn_controller[96072]: 2026-02-16T13:44:50Z|00171|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f ovn-installed in OVS
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.486 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-0 nova_compute[185723]: 2026-02-16 13:44:50.487 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-0 systemd-machined[155229]: New machine qemu-16-instance-00000013.
Feb 16 13:44:50 compute-0 systemd-udevd[214048]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:44:50 compute-0 NetworkManager[56177]: <info>  [1771249490.5220] device (tapb5736eee-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:44:50 compute-0 NetworkManager[56177]: <info>  [1771249490.5230] device (tapb5736eee-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:44:50 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000013.
Feb 16 13:44:51 compute-0 nova_compute[185723]: 2026-02-16 13:44:51.614 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249491.6139104, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:44:51 compute-0 nova_compute[185723]: 2026-02-16 13:44:51.615 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Started (Lifecycle Event)
Feb 16 13:44:51 compute-0 nova_compute[185723]: 2026-02-16 13:44:51.977 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:44:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:52.315 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:52 compute-0 nova_compute[185723]: 2026-02-16 13:44:52.379 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249492.3788435, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:44:52 compute-0 nova_compute[185723]: 2026-02-16 13:44:52.380 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Resumed (Lifecycle Event)
Feb 16 13:44:52 compute-0 nova_compute[185723]: 2026-02-16 13:44:52.412 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:44:52 compute-0 nova_compute[185723]: 2026-02-16 13:44:52.414 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:44:52 compute-0 nova_compute[185723]: 2026-02-16 13:44:52.452 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:44:53 compute-0 nova_compute[185723]: 2026-02-16 13:44:53.318 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:55 compute-0 ovn_controller[96072]: 2026-02-16T13:44:55Z|00172|binding|INFO|Claiming lport b5736eee-a7c7-4376-87f8-2ba8e852813f for this chassis.
Feb 16 13:44:55 compute-0 ovn_controller[96072]: 2026-02-16T13:44:55Z|00173|binding|INFO|b5736eee-a7c7-4376-87f8-2ba8e852813f: Claiming fa:16:3e:e4:03:04 10.100.0.11
Feb 16 13:44:55 compute-0 ovn_controller[96072]: 2026-02-16T13:44:55Z|00174|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f up in Southbound
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.161 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:03:04 10.100.0.11'], port_security=['fa:16:3e:e4:03:04 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5a1cf877-f781-4088-8f98-19d39a95d5bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b5736eee-a7c7-4376-87f8-2ba8e852813f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.162 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b5736eee-a7c7-4376-87f8-2ba8e852813f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.163 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.176 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[227c3810-5018-475e-9779-b2066b645797]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.200 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3c04ac2e-7aa0-4630-b92b-827a4b3c7e17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.203 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[bfabfe10-0afc-4712-a36f-66b04a034e3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 nova_compute[185723]: 2026-02-16 13:44:55.224 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.225 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb3e2cb-9da2-4e63-b2b8-6457c417e522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.241 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7145c80a-5c0d-4a5a-ba63-010392256cb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550951, 'reachable_time': 43761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214082, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.259 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5f40fc-48f4-4f86-b114-94ed1d23de70]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550959, 'tstamp': 550959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214083, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550961, 'tstamp': 550961}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214083, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.261 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:55 compute-0 nova_compute[185723]: 2026-02-16 13:44:55.263 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.264 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.264 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.265 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:44:55.265 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:44:58 compute-0 nova_compute[185723]: 2026-02-16 13:44:58.321 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:58 compute-0 nova_compute[185723]: 2026-02-16 13:44:58.418 185727 INFO nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Post operation of migration started
Feb 16 13:44:58 compute-0 nova_compute[185723]: 2026-02-16 13:44:58.985 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:44:58 compute-0 nova_compute[185723]: 2026-02-16 13:44:58.985 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:44:58 compute-0 nova_compute[185723]: 2026-02-16 13:44:58.985 185727 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:44:59 compute-0 podman[195053]: time="2026-02-16T13:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:44:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:44:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 16 13:45:00 compute-0 nova_compute[185723]: 2026-02-16 13:45:00.226 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:00 compute-0 nova_compute[185723]: 2026-02-16 13:45:00.977 185727 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:45:01 compute-0 nova_compute[185723]: 2026-02-16 13:45:01.105 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:45:01 compute-0 nova_compute[185723]: 2026-02-16 13:45:01.123 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:01 compute-0 nova_compute[185723]: 2026-02-16 13:45:01.124 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:01 compute-0 nova_compute[185723]: 2026-02-16 13:45:01.124 185727 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:01 compute-0 nova_compute[185723]: 2026-02-16 13:45:01.128 185727 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:45:01 compute-0 virtqemud[184843]: Domain id=16 name='instance-00000013' uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc is tainted: custom-monitor
Feb 16 13:45:01 compute-0 openstack_network_exporter[197909]: ERROR   13:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:45:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:45:01 compute-0 openstack_network_exporter[197909]: ERROR   13:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:45:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:45:02 compute-0 nova_compute[185723]: 2026-02-16 13:45:02.135 185727 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:45:03 compute-0 nova_compute[185723]: 2026-02-16 13:45:03.169 185727 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:45:03 compute-0 nova_compute[185723]: 2026-02-16 13:45:03.173 185727 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:45:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:03.238 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:03.238 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:03.239 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:03 compute-0 podman[214084]: 2026-02-16 13:45:03.266058794 +0000 UTC m=+0.054153233 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, distribution-scope=public, architecture=x86_64)
Feb 16 13:45:03 compute-0 podman[214085]: 2026-02-16 13:45:03.283403654 +0000 UTC m=+0.072127109 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:45:03 compute-0 nova_compute[185723]: 2026-02-16 13:45:03.306 185727 DEBUG nova.objects.instance [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:45:03 compute-0 nova_compute[185723]: 2026-02-16 13:45:03.324 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:05 compute-0 nova_compute[185723]: 2026-02-16 13:45:05.227 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 podman[214123]: 2026-02-16 13:45:08.046368671 +0000 UTC m=+0.086579277 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.325 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.434 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.435 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.435 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.435 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.435 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.436 185727 INFO nova.compute.manager [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Terminating instance
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.437 185727 DEBUG nova.compute.manager [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:45:08 compute-0 kernel: tapabc17ab2-80 (unregistering): left promiscuous mode
Feb 16 13:45:08 compute-0 NetworkManager[56177]: <info>  [1771249508.4637] device (tapabc17ab2-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:45:08 compute-0 ovn_controller[96072]: 2026-02-16T13:45:08Z|00175|binding|INFO|Releasing lport abc17ab2-80ac-4b01-990a-6eb58e6238ba from this chassis (sb_readonly=0)
Feb 16 13:45:08 compute-0 ovn_controller[96072]: 2026-02-16T13:45:08Z|00176|binding|INFO|Setting lport abc17ab2-80ac-4b01-990a-6eb58e6238ba down in Southbound
Feb 16 13:45:08 compute-0 ovn_controller[96072]: 2026-02-16T13:45:08Z|00177|binding|INFO|Removing iface tapabc17ab2-80 ovn-installed in OVS
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.469 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.478 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.481 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:ae:a5 10.100.0.14'], port_security=['fa:16:3e:eb:ae:a5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e5e9f487-f690-47b2-aaed-59236907f08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=abc17ab2-80ac-4b01-990a-6eb58e6238ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.483 105360 INFO neutron.agent.ovn.metadata.agent [-] Port abc17ab2-80ac-4b01-990a-6eb58e6238ba in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.484 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.496 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c795fbb2-bf12-494c-9d13-dedc4f9d2f99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Deactivated successfully.
Feb 16 13:45:08 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Consumed 14.875s CPU time.
Feb 16 13:45:08 compute-0 systemd-machined[155229]: Machine qemu-15-instance-00000014 terminated.
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.519 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3276dab5-f170-4115-93b3-5b0c44f39e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.523 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[b14c9741-220b-4a2e-b4ae-d178dcad3703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.544 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c735cc-0551-48a0-8a2f-140503069a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.562 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ae96da58-030d-4669-b97e-e690e4a942be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550951, 'reachable_time': 43761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214162, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.573 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[092c7708-c705-402c-8340-2c6edc7647fb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550959, 'tstamp': 550959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214163, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550961, 'tstamp': 550961}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214163, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.575 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.576 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.581 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.582 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.582 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.583 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:08 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:08.583 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.652 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.656 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.689 185727 INFO nova.virt.libvirt.driver [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Instance destroyed successfully.
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.690 185727 DEBUG nova.objects.instance [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid e5e9f487-f690-47b2-aaed-59236907f08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.713 185727 DEBUG nova.virt.libvirt.vif [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2076948568',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2076948568',id=20,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:43:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-1ickj6a9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:43:47Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=e5e9f487-f690-47b2-aaed-59236907f08b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.713 185727 DEBUG nova.network.os_vif_util [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "address": "fa:16:3e:eb:ae:a5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapabc17ab2-80", "ovs_interfaceid": "abc17ab2-80ac-4b01-990a-6eb58e6238ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.714 185727 DEBUG nova.network.os_vif_util [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.714 185727 DEBUG os_vif [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.715 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.716 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapabc17ab2-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.717 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.718 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.720 185727 INFO os_vif [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:ae:a5,bridge_name='br-int',has_traffic_filtering=True,id=abc17ab2-80ac-4b01-990a-6eb58e6238ba,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapabc17ab2-80')
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.720 185727 INFO nova.virt.libvirt.driver [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Deleting instance files /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b_del
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.721 185727 INFO nova.virt.libvirt.driver [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Deletion of /var/lib/nova/instances/e5e9f487-f690-47b2-aaed-59236907f08b_del complete
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.807 185727 INFO nova.compute.manager [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.807 185727 DEBUG oslo.service.loopingcall [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.808 185727 DEBUG nova.compute.manager [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:45:08 compute-0 nova_compute[185723]: 2026-02-16 13:45:08.808 185727 DEBUG nova.network.neutron [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.557 185727 DEBUG nova.compute.manager [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-unplugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.558 185727 DEBUG oslo_concurrency.lockutils [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.558 185727 DEBUG oslo_concurrency.lockutils [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.558 185727 DEBUG oslo_concurrency.lockutils [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.558 185727 DEBUG nova.compute.manager [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] No waiting events found dispatching network-vif-unplugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.559 185727 DEBUG nova.compute.manager [req-b7fb506f-7df0-4e9d-8133-c9aa3bf7e983 req-2072ae40-ce68-4cb0-8ff7-d3974f10a751 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-unplugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.914 185727 DEBUG nova.network.neutron [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:45:09 compute-0 nova_compute[185723]: 2026-02-16 13:45:09.950 185727 INFO nova.compute.manager [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Took 1.14 seconds to deallocate network for instance.
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.019 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.019 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.121 185727 DEBUG nova.compute.provider_tree [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.144 185727 DEBUG nova.scheduler.client.report [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.170 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.242 185727 INFO nova.scheduler.client.report [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance e5e9f487-f690-47b2-aaed-59236907f08b
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.274 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.334 185727 DEBUG oslo_concurrency.lockutils [None req-5be2052d-dba6-4a4e-961a-ca20a6cf33a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.893 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.894 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.894 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.894 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.894 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.895 185727 INFO nova.compute.manager [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Terminating instance
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.896 185727 DEBUG nova.compute.manager [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:45:10 compute-0 kernel: tapb5736eee-a7 (unregistering): left promiscuous mode
Feb 16 13:45:10 compute-0 NetworkManager[56177]: <info>  [1771249510.9239] device (tapb5736eee-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.925 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:10 compute-0 ovn_controller[96072]: 2026-02-16T13:45:10Z|00178|binding|INFO|Releasing lport b5736eee-a7c7-4376-87f8-2ba8e852813f from this chassis (sb_readonly=0)
Feb 16 13:45:10 compute-0 ovn_controller[96072]: 2026-02-16T13:45:10Z|00179|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f down in Southbound
Feb 16 13:45:10 compute-0 ovn_controller[96072]: 2026-02-16T13:45:10Z|00180|binding|INFO|Removing iface tapb5736eee-a7 ovn-installed in OVS
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.928 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:10 compute-0 nova_compute[185723]: 2026-02-16 13:45:10.934 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:10.934 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:03:04 10.100.0.11'], port_security=['fa:16:3e:e4:03:04 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5a1cf877-f781-4088-8f98-19d39a95d5bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b5736eee-a7c7-4376-87f8-2ba8e852813f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:45:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:10.936 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b5736eee-a7c7-4376-87f8-2ba8e852813f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:45:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:10.937 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:45:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:10.938 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e5de6a35-32d2-4d21-a030-eeb593946194]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:10 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:10.939 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:45:10 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Deactivated successfully.
Feb 16 13:45:10 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Consumed 2.214s CPU time.
Feb 16 13:45:10 compute-0 systemd-machined[155229]: Machine qemu-16-instance-00000013 terminated.
Feb 16 13:45:11 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [NOTICE]   (213760) : haproxy version is 2.8.14-c23fe91
Feb 16 13:45:11 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [NOTICE]   (213760) : path to executable is /usr/sbin/haproxy
Feb 16 13:45:11 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [WARNING]  (213760) : Exiting Master process...
Feb 16 13:45:11 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [ALERT]    (213760) : Current worker (213762) exited with code 143 (Terminated)
Feb 16 13:45:11 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213756]: [WARNING]  (213760) : All workers exited. Exiting... (0)
Feb 16 13:45:11 compute-0 systemd[1]: libpod-4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971.scope: Deactivated successfully.
Feb 16 13:45:11 compute-0 podman[214203]: 2026-02-16 13:45:11.057327532 +0000 UTC m=+0.043065288 container died 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:45:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971-userdata-shm.mount: Deactivated successfully.
Feb 16 13:45:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-94becb21cc7476a7bacb97d85f00285439de3e453447abec59d7416e107ac9bf-merged.mount: Deactivated successfully.
Feb 16 13:45:11 compute-0 podman[214203]: 2026-02-16 13:45:11.090484624 +0000 UTC m=+0.076222380 container cleanup 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 16 13:45:11 compute-0 systemd[1]: libpod-conmon-4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971.scope: Deactivated successfully.
Feb 16 13:45:11 compute-0 NetworkManager[56177]: <info>  [1771249511.1130] manager: (tapb5736eee-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.141 185727 INFO nova.virt.libvirt.driver [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance destroyed successfully.
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.141 185727 DEBUG nova.objects.instance [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:45:11 compute-0 podman[214232]: 2026-02-16 13:45:11.148211335 +0000 UTC m=+0.040307920 container remove 4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.152 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[003088fb-da8f-4861-aa4c-34f300f8816d]: (4, ('Mon Feb 16 01:45:11 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971)\n4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971\nMon Feb 16 01:45:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971)\n4fd7d8f73edfc720b4031ed6d16086fa8d0733c87d92b2a6e99faeef54334971\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.154 185727 DEBUG nova.virt.libvirt.vif [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:43:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:45:03Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.154 185727 DEBUG nova.network.os_vif_util [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.154 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d1052a5a-c62a-4af5-9787-acc9b1acc61a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.155 185727 DEBUG nova.network.os_vif_util [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.155 185727 DEBUG os_vif [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.155 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.157 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.158 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5736eee-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.159 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.160 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.161 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.166 185727 INFO os_vif [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7')
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.167 185727 INFO nova.virt.libvirt.driver [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Deleting instance files /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc_del
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.167 185727 INFO nova.virt.libvirt.driver [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Deletion of /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc_del complete
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.167 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e892d691-ca4d-4c0b-90ef-07bcd448df95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.179 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1a868291-6cb8-4983-a672-992f1881347b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.180 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5325e99a-ca76-4904-aa0a-3bb1626b0fe5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.192 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[220764ab-1e38-45fc-9127-67bc0f82664b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550945, 'reachable_time': 36589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214266, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.194 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:45:11 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:45:11.194 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e8044bf6-32d8-4d43-b9e2-868342884063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:45:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.234 185727 INFO nova.compute.manager [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.235 185727 DEBUG oslo.service.loopingcall [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.235 185727 DEBUG nova.compute.manager [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.236 185727 DEBUG nova.network.neutron [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.975 185727 DEBUG nova.compute.manager [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.975 185727 DEBUG oslo_concurrency.lockutils [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.975 185727 DEBUG oslo_concurrency.lockutils [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.976 185727 DEBUG oslo_concurrency.lockutils [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e5e9f487-f690-47b2-aaed-59236907f08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.976 185727 DEBUG nova.compute.manager [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] No waiting events found dispatching network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.976 185727 WARNING nova.compute.manager [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received unexpected event network-vif-plugged-abc17ab2-80ac-4b01-990a-6eb58e6238ba for instance with vm_state deleted and task_state None.
Feb 16 13:45:11 compute-0 nova_compute[185723]: 2026-02-16 13:45:11.976 185727 DEBUG nova.compute.manager [req-4ab500bc-213c-4416-9558-dc335a6e22b0 req-340f5325-02da-457c-a986-30c205f1b338 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Received event network-vif-deleted-abc17ab2-80ac-4b01-990a-6eb58e6238ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.982 185727 DEBUG nova.compute.manager [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.982 185727 DEBUG oslo_concurrency.lockutils [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.983 185727 DEBUG oslo_concurrency.lockutils [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.983 185727 DEBUG oslo_concurrency.lockutils [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.983 185727 DEBUG nova.compute.manager [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:45:12 compute-0 nova_compute[185723]: 2026-02-16 13:45:12.983 185727 DEBUG nova.compute.manager [req-b5f43953-8495-4d4b-acdd-56f2ea017037 req-483e0972-a50f-4c43-abfe-dca942a95cdb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.078 185727 DEBUG nova.network.neutron [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.116 185727 INFO nova.compute.manager [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Took 1.88 seconds to deallocate network for instance.
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.177 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.178 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.186 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.236 185727 INFO nova.scheduler.client.report [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 5a1cf877-f781-4088-8f98-19d39a95d5bc
Feb 16 13:45:13 compute-0 nova_compute[185723]: 2026-02-16 13:45:13.370 185727 DEBUG oslo_concurrency.lockutils [None req-fcea5fe2-5552-40e8-9aa1-35d03b76ec2b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:13 compute-0 sshd-session[214267]: Invalid user admin from 64.227.72.94 port 58186
Feb 16 13:45:14 compute-0 sshd-session[214267]: Connection closed by invalid user admin 64.227.72.94 port 58186 [preauth]
Feb 16 13:45:14 compute-0 nova_compute[185723]: 2026-02-16 13:45:14.185 185727 DEBUG nova.compute.manager [req-b61b7854-f46e-413b-a5f1-775d7359ab48 req-da248fd3-4537-4047-9e8b-f0a1c48c0296 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-deleted-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.113 185727 DEBUG nova.compute.manager [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.114 185727 DEBUG oslo_concurrency.lockutils [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.114 185727 DEBUG oslo_concurrency.lockutils [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.115 185727 DEBUG oslo_concurrency.lockutils [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.115 185727 DEBUG nova.compute.manager [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.115 185727 WARNING nova.compute.manager [req-40150a49-228c-495d-8ba3-9591ff013aa5 req-50439232-7ca1-4ac7-ae83-ad3b5fd1a5e3 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state deleted and task_state None.
Feb 16 13:45:15 compute-0 nova_compute[185723]: 2026-02-16 13:45:15.276 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:16 compute-0 nova_compute[185723]: 2026-02-16 13:45:16.801 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:18 compute-0 podman[214269]: 2026-02-16 13:45:18.011237997 +0000 UTC m=+0.046563186 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.278 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.561 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.562 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.562 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.562 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.715 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.717 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5817MB free_disk=73.22406005859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.717 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.717 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.822 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.823 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.852 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.896 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.948 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:45:20 compute-0 nova_compute[185723]: 2026-02-16 13:45:20.949 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:21 compute-0 nova_compute[185723]: 2026-02-16 13:45:21.804 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:21 compute-0 nova_compute[185723]: 2026-02-16 13:45:21.950 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:21 compute-0 nova_compute[185723]: 2026-02-16 13:45:21.951 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.553 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.555 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.688 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249508.6875508, e5e9f487-f690-47b2-aaed-59236907f08b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.688 185727 INFO nova.compute.manager [-] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] VM Stopped (Lifecycle Event)
Feb 16 13:45:23 compute-0 nova_compute[185723]: 2026-02-16 13:45:23.713 185727 DEBUG nova.compute.manager [None req-a243b0e6-af9a-405f-8326-d4fb0efc2285 - - - - - -] [instance: e5e9f487-f690-47b2-aaed-59236907f08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:45:24 compute-0 nova_compute[185723]: 2026-02-16 13:45:24.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:24 compute-0 nova_compute[185723]: 2026-02-16 13:45:24.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:25 compute-0 nova_compute[185723]: 2026-02-16 13:45:25.280 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:25 compute-0 nova_compute[185723]: 2026-02-16 13:45:25.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:25 compute-0 nova_compute[185723]: 2026-02-16 13:45:25.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:25 compute-0 nova_compute[185723]: 2026-02-16 13:45:25.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:45:26 compute-0 nova_compute[185723]: 2026-02-16 13:45:26.137 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249511.1352036, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:45:26 compute-0 nova_compute[185723]: 2026-02-16 13:45:26.137 185727 INFO nova.compute.manager [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Stopped (Lifecycle Event)
Feb 16 13:45:26 compute-0 nova_compute[185723]: 2026-02-16 13:45:26.168 185727 DEBUG nova.compute.manager [None req-f4a7c882-85db-4dc5-94d3-ced02b736c46 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:45:26 compute-0 nova_compute[185723]: 2026-02-16 13:45:26.807 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:27 compute-0 nova_compute[185723]: 2026-02-16 13:45:27.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:29 compute-0 podman[195053]: time="2026-02-16T13:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:45:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:45:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 13:45:30 compute-0 nova_compute[185723]: 2026-02-16 13:45:30.281 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:31 compute-0 openstack_network_exporter[197909]: ERROR   13:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:45:31 compute-0 openstack_network_exporter[197909]: ERROR   13:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:45:31 compute-0 nova_compute[185723]: 2026-02-16 13:45:31.809 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:32 compute-0 sshd-session[214296]: Invalid user postgres from 188.166.42.159 port 43490
Feb 16 13:45:32 compute-0 sshd-session[214296]: Connection closed by invalid user postgres 188.166.42.159 port 43490 [preauth]
Feb 16 13:45:34 compute-0 podman[214299]: 2026-02-16 13:45:34.012224824 +0000 UTC m=+0.050268838 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:45:34 compute-0 podman[214298]: 2026-02-16 13:45:34.017503525 +0000 UTC m=+0.056486202 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.buildah.version=1.33.7, version=9.7, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Feb 16 13:45:35 compute-0 nova_compute[185723]: 2026-02-16 13:45:35.283 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:36 compute-0 nova_compute[185723]: 2026-02-16 13:45:36.811 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:39 compute-0 podman[214336]: 2026-02-16 13:45:39.026950862 +0000 UTC m=+0.069066694 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb 16 13:45:40 compute-0 nova_compute[185723]: 2026-02-16 13:45:40.284 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:41 compute-0 nova_compute[185723]: 2026-02-16 13:45:41.814 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:43 compute-0 ovn_controller[96072]: 2026-02-16T13:45:43Z|00181|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:45:45 compute-0 nova_compute[185723]: 2026-02-16 13:45:45.285 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:46 compute-0 nova_compute[185723]: 2026-02-16 13:45:46.817 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:49 compute-0 podman[214362]: 2026-02-16 13:45:49.027203075 +0000 UTC m=+0.070215081 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:45:50 compute-0 nova_compute[185723]: 2026-02-16 13:45:50.288 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:51 compute-0 nova_compute[185723]: 2026-02-16 13:45:51.819 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:55 compute-0 nova_compute[185723]: 2026-02-16 13:45:55.330 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:56 compute-0 nova_compute[185723]: 2026-02-16 13:45:56.821 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:59 compute-0 podman[195053]: time="2026-02-16T13:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:45:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:45:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 13:46:00 compute-0 nova_compute[185723]: 2026-02-16 13:46:00.331 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:00 compute-0 sshd-session[214386]: Invalid user test from 146.190.226.24 port 43084
Feb 16 13:46:01 compute-0 sshd-session[214386]: Connection closed by invalid user test 146.190.226.24 port 43084 [preauth]
Feb 16 13:46:01 compute-0 openstack_network_exporter[197909]: ERROR   13:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:46:01 compute-0 openstack_network_exporter[197909]: ERROR   13:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:46:01 compute-0 nova_compute[185723]: 2026-02-16 13:46:01.823 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:03.239 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:03.240 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:03.240 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:03 compute-0 sshd-session[214388]: Invalid user ubuntu from 64.227.72.94 port 59026
Feb 16 13:46:03 compute-0 sshd-session[214388]: Connection closed by invalid user ubuntu 64.227.72.94 port 59026 [preauth]
Feb 16 13:46:04 compute-0 sshd-session[214390]: Invalid user nagios from 146.190.22.227 port 33718
Feb 16 13:46:04 compute-0 podman[214392]: 2026-02-16 13:46:04.825983604 +0000 UTC m=+0.049076777 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 13:46:04 compute-0 podman[214393]: 2026-02-16 13:46:04.831162233 +0000 UTC m=+0.049138020 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:46:04 compute-0 sshd-session[214390]: Connection closed by invalid user nagios 146.190.22.227 port 33718 [preauth]
Feb 16 13:46:05 compute-0 nova_compute[185723]: 2026-02-16 13:46:05.333 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:06 compute-0 nova_compute[185723]: 2026-02-16 13:46:06.825 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:10 compute-0 podman[214431]: 2026-02-16 13:46:10.044291318 +0000 UTC m=+0.078800915 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:46:10 compute-0 nova_compute[185723]: 2026-02-16 13:46:10.335 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:11 compute-0 nova_compute[185723]: 2026-02-16 13:46:11.828 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:15 compute-0 nova_compute[185723]: 2026-02-16 13:46:15.375 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:16 compute-0 nova_compute[185723]: 2026-02-16 13:46:16.831 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:20 compute-0 podman[214458]: 2026-02-16 13:46:20.028287698 +0000 UTC m=+0.066110050 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:46:20 compute-0 nova_compute[185723]: 2026-02-16 13:46:20.377 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.463 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.464 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.464 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.619 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.621 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5859MB free_disk=73.22407913208008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.621 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.621 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.712 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.713 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.728 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.758 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.758 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.776 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.814 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.834 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.838 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.863 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.865 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:46:21 compute-0 nova_compute[185723]: 2026-02-16 13:46:21.865 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:22 compute-0 nova_compute[185723]: 2026-02-16 13:46:22.865 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:23 compute-0 nova_compute[185723]: 2026-02-16 13:46:23.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:23 compute-0 nova_compute[185723]: 2026-02-16 13:46:23.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:46:23 compute-0 nova_compute[185723]: 2026-02-16 13:46:23.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:46:23 compute-0 nova_compute[185723]: 2026-02-16 13:46:23.515 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:46:23 compute-0 nova_compute[185723]: 2026-02-16 13:46:23.516 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:24 compute-0 nova_compute[185723]: 2026-02-16 13:46:24.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.329 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.330 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.362 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.378 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.445 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.446 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.452 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.452 185727 INFO nova.compute.claims [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.576 185727 DEBUG nova.compute.provider_tree [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.592 185727 DEBUG nova.scheduler.client.report [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.612 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.613 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.689 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.689 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.713 185727 INFO nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.731 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.829 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.831 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.831 185727 INFO nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Creating image(s)
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.832 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.832 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.833 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.845 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.889 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.891 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.891 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.902 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:25 compute-0 sshd-session[214482]: Invalid user postgres from 188.166.42.159 port 35268
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.963 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.964 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:25 compute-0 nova_compute[185723]: 2026-02-16 13:46:25.999 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.000 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.000 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.051 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.052 185727 DEBUG nova.virt.disk.api [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.052 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:26 compute-0 sshd-session[214482]: Connection closed by invalid user postgres 188.166.42.159 port 35268 [preauth]
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.104 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.105 185727 DEBUG nova.virt.disk.api [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.106 185727 DEBUG nova.objects.instance [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 5fc1ad70-adc5-4109-a323-39a1b7137888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.122 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.122 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Ensure instance console log exists: /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.123 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.123 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.124 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.797 185727 DEBUG nova.policy [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:46:26 compute-0 nova_compute[185723]: 2026-02-16 13:46:26.836 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:27 compute-0 nova_compute[185723]: 2026-02-16 13:46:27.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:27 compute-0 nova_compute[185723]: 2026-02-16 13:46:27.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:46:27 compute-0 nova_compute[185723]: 2026-02-16 13:46:27.838 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:27.840 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:46:27 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:27.841 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:46:28 compute-0 nova_compute[185723]: 2026-02-16 13:46:28.753 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Successfully created port: 303c8798-0b7a-4dc2-ac3d-fd237012b497 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.709 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Successfully updated port: 303c8798-0b7a-4dc2-ac3d-fd237012b497 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.732 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.732 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.732 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:46:29 compute-0 podman[195053]: time="2026-02-16T13:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:46:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:46:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.816 185727 DEBUG nova.compute.manager [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-changed-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.817 185727 DEBUG nova.compute.manager [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Refreshing instance network info cache due to event network-changed-303c8798-0b7a-4dc2-ac3d-fd237012b497. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.818 185727 DEBUG oslo_concurrency.lockutils [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:46:29 compute-0 nova_compute[185723]: 2026-02-16 13:46:29.908 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:46:30 compute-0 nova_compute[185723]: 2026-02-16 13:46:30.401 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 openstack_network_exporter[197909]: ERROR   13:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:46:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:46:31 compute-0 openstack_network_exporter[197909]: ERROR   13:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:46:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.838 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.893 185727 DEBUG nova.network.neutron [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updating instance_info_cache with network_info: [{"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.919 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.920 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Instance network_info: |[{"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.920 185727 DEBUG oslo_concurrency.lockutils [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.921 185727 DEBUG nova.network.neutron [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Refreshing network info cache for port 303c8798-0b7a-4dc2-ac3d-fd237012b497 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.923 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Start _get_guest_xml network_info=[{"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.928 185727 WARNING nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.936 185727 DEBUG nova.virt.libvirt.host [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.937 185727 DEBUG nova.virt.libvirt.host [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.943 185727 DEBUG nova.virt.libvirt.host [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.944 185727 DEBUG nova.virt.libvirt.host [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.946 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.946 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.946 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.947 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.947 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.947 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.947 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.948 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.948 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.948 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.948 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.949 185727 DEBUG nova.virt.hardware [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.952 185727 DEBUG nova.virt.libvirt.vif [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1875661694',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1875661694',id=21,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-kx5zzt03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:46:25Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5fc1ad70-adc5-4109-a323-39a1b7137888,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.953 185727 DEBUG nova.network.os_vif_util [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.954 185727 DEBUG nova.network.os_vif_util [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.955 185727 DEBUG nova.objects.instance [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5fc1ad70-adc5-4109-a323-39a1b7137888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.971 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <uuid>5fc1ad70-adc5-4109-a323-39a1b7137888</uuid>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <name>instance-00000015</name>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-1875661694</nova:name>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:46:31</nova:creationTime>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         <nova:port uuid="303c8798-0b7a-4dc2-ac3d-fd237012b497">
Feb 16 13:46:31 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <system>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="serial">5fc1ad70-adc5-4109-a323-39a1b7137888</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="uuid">5fc1ad70-adc5-4109-a323-39a1b7137888</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </system>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <os>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </os>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <features>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </features>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.config"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:71:1d:c5"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <target dev="tap303c8798-0b"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/console.log" append="off"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <video>
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </video>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:46:31 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:46:31 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:46:31 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:46:31 compute-0 nova_compute[185723]: </domain>
Feb 16 13:46:31 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.972 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Preparing to wait for external event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.972 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.972 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.973 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.973 185727 DEBUG nova.virt.libvirt.vif [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1875661694',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1875661694',id=21,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-kx5zzt03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:46:25Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5fc1ad70-adc5-4109-a323-39a1b7137888,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.974 185727 DEBUG nova.network.os_vif_util [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.974 185727 DEBUG nova.network.os_vif_util [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.975 185727 DEBUG os_vif [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.976 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.976 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.977 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.979 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.979 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap303c8798-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.979 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap303c8798-0b, col_values=(('external_ids', {'iface-id': '303c8798-0b7a-4dc2-ac3d-fd237012b497', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:1d:c5', 'vm-uuid': '5fc1ad70-adc5-4109-a323-39a1b7137888'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.981 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 NetworkManager[56177]: <info>  [1771249591.9828] manager: (tap303c8798-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.985 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.986 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-0 nova_compute[185723]: 2026-02-16 13:46:31.987 185727 INFO os_vif [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b')
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.035 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.035 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.036 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:71:1d:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.036 185727 INFO nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Using config drive
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.922 185727 INFO nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Creating config drive at /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.config
Feb 16 13:46:32 compute-0 nova_compute[185723]: 2026-02-16 13:46:32.926 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppb89ywmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.044 185727 DEBUG oslo_concurrency.processutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmppb89ywmp" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:33 compute-0 kernel: tap303c8798-0b: entered promiscuous mode
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.1000] manager: (tap303c8798-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.101 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_controller[96072]: 2026-02-16T13:46:33Z|00182|binding|INFO|Claiming lport 303c8798-0b7a-4dc2-ac3d-fd237012b497 for this chassis.
Feb 16 13:46:33 compute-0 ovn_controller[96072]: 2026-02-16T13:46:33Z|00183|binding|INFO|303c8798-0b7a-4dc2-ac3d-fd237012b497: Claiming fa:16:3e:71:1d:c5 10.100.0.13
Feb 16 13:46:33 compute-0 ovn_controller[96072]: 2026-02-16T13:46:33Z|00184|binding|INFO|Setting lport 303c8798-0b7a-4dc2-ac3d-fd237012b497 ovn-installed in OVS
Feb 16 13:46:33 compute-0 ovn_controller[96072]: 2026-02-16T13:46:33Z|00185|binding|INFO|Setting lport 303c8798-0b7a-4dc2-ac3d-fd237012b497 up in Southbound
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.107 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:1d:c5 10.100.0.13'], port_security=['fa:16:3e:71:1d:c5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5fc1ad70-adc5-4109-a323-39a1b7137888', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=303c8798-0b7a-4dc2-ac3d-fd237012b497) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.108 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.109 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 303c8798-0b7a-4dc2-ac3d-fd237012b497 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.110 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.110 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.120 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8bca6310-3178-445b-b2e4-ebad082f8f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.121 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.122 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.123 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[48635aa8-b518-4f53-8499-05e85a040495]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.123 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ebfe442e-2cfe-49d2-ba92-d9fb9bece3b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 systemd-udevd[214520]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.133 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7bbc35-7fb1-4aea-8718-89c2dec8dfb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 systemd-machined[155229]: New machine qemu-17-instance-00000015.
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.1441] device (tap303c8798-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.1450] device (tap303c8798-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:46:33 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000015.
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.158 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d24709-b9ce-4763-8950-cd39a61c0824]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.188 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a07a3184-82ab-49ce-af69-5a66e58be2e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.192 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[03f9c14b-2d94-4994-833b-45b5c837d392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.1938] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.224 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[19f9c73d-5f5a-4a9c-a777-9af87b83e876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.227 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[8d61e3c8-dda3-4c81-a992-5e1929409f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.2444] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.248 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[78548831-9411-45ab-a9dd-d1df09d2649f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.265 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[90ac4ded-5122-4d9b-952e-8f7b4db84c19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567663, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214552, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.279 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[18c966ab-8b38-491b-87c2-8927e3111936]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567663, 'tstamp': 567663}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214553, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.298 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[548b4c03-c1c0-4748-929a-1b128abad71e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567663, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214554, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.326 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[449657ed-62e8-4f79-9db1-a4caf5d569c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.384 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[20826470-8431-415c-83b5-24aa9d7f59e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.386 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.386 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.386 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.388 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 NetworkManager[56177]: <info>  [1771249593.3901] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 16 13:46:33 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.392 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.393 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.395 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_controller[96072]: 2026-02-16T13:46:33Z|00186|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.400 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.401 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.403 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c049ec25-fefe-422f-a5b3-94120d887820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.404 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.404 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.508 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249593.5072894, 5fc1ad70-adc5-4109-a323-39a1b7137888 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.509 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] VM Started (Lifecycle Event)
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.535 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.540 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249593.5087626, 5fc1ad70-adc5-4109-a323-39a1b7137888 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.540 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] VM Paused (Lifecycle Event)
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.565 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.568 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.589 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:46:33 compute-0 podman[214593]: 2026-02-16 13:46:33.755415635 +0000 UTC m=+0.048399380 container create 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 16 13:46:33 compute-0 systemd[1]: Started libpod-conmon-9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73.scope.
Feb 16 13:46:33 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:46:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81fd336443fdc13027760b7f85d6676cd0044ff839f0a3c63822e13fbab7877a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:46:33 compute-0 podman[214593]: 2026-02-16 13:46:33.730958979 +0000 UTC m=+0.023942774 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:46:33 compute-0 podman[214593]: 2026-02-16 13:46:33.833643144 +0000 UTC m=+0.126626919 container init 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:46:33 compute-0 podman[214593]: 2026-02-16 13:46:33.83792011 +0000 UTC m=+0.130903865 container start 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:46:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:46:33.844 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:33 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [NOTICE]   (214612) : New worker (214614) forked
Feb 16 13:46:33 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [NOTICE]   (214612) : Loading success.
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.870 185727 DEBUG nova.compute.manager [req-c68c61ab-9653-48b7-96c2-18b81fc2a629 req-775fce8c-a751-49f7-84ef-bc8b36db6775 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.871 185727 DEBUG oslo_concurrency.lockutils [req-c68c61ab-9653-48b7-96c2-18b81fc2a629 req-775fce8c-a751-49f7-84ef-bc8b36db6775 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.871 185727 DEBUG oslo_concurrency.lockutils [req-c68c61ab-9653-48b7-96c2-18b81fc2a629 req-775fce8c-a751-49f7-84ef-bc8b36db6775 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.871 185727 DEBUG oslo_concurrency.lockutils [req-c68c61ab-9653-48b7-96c2-18b81fc2a629 req-775fce8c-a751-49f7-84ef-bc8b36db6775 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.871 185727 DEBUG nova.compute.manager [req-c68c61ab-9653-48b7-96c2-18b81fc2a629 req-775fce8c-a751-49f7-84ef-bc8b36db6775 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Processing event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.872 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.878 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249593.8772833, 5fc1ad70-adc5-4109-a323-39a1b7137888 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.878 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] VM Resumed (Lifecycle Event)
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.880 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.884 185727 INFO nova.virt.libvirt.driver [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Instance spawned successfully.
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.885 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.913 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.918 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.919 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.919 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.920 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.920 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.922 185727 DEBUG nova.virt.libvirt.driver [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.926 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.963 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.992 185727 INFO nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Took 8.16 seconds to spawn the instance on the hypervisor.
Feb 16 13:46:33 compute-0 nova_compute[185723]: 2026-02-16 13:46:33.993 185727 DEBUG nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:34 compute-0 nova_compute[185723]: 2026-02-16 13:46:34.054 185727 DEBUG nova.network.neutron [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updated VIF entry in instance network info cache for port 303c8798-0b7a-4dc2-ac3d-fd237012b497. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:46:34 compute-0 nova_compute[185723]: 2026-02-16 13:46:34.062 185727 DEBUG nova.network.neutron [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updating instance_info_cache with network_info: [{"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:46:34 compute-0 nova_compute[185723]: 2026-02-16 13:46:34.071 185727 INFO nova.compute.manager [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Took 8.64 seconds to build instance.
Feb 16 13:46:34 compute-0 nova_compute[185723]: 2026-02-16 13:46:34.084 185727 DEBUG oslo_concurrency.lockutils [req-e62a8be0-5db1-4529-aa57-7c2560781ff0 req-d53289c9-8762-4fe6-adcf-b649cc5fd3ac faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:46:34 compute-0 nova_compute[185723]: 2026-02-16 13:46:34.087 185727 DEBUG oslo_concurrency.lockutils [None req-ca70c9ed-3aa2-4463-acf8-e9bdee047ef7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:35 compute-0 podman[214624]: 2026-02-16 13:46:35.01961262 +0000 UTC m=+0.054980664 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:46:35 compute-0 podman[214623]: 2026-02-16 13:46:35.023152437 +0000 UTC m=+0.060900890 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.403 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.959 185727 DEBUG nova.compute.manager [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.960 185727 DEBUG oslo_concurrency.lockutils [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.960 185727 DEBUG oslo_concurrency.lockutils [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.960 185727 DEBUG oslo_concurrency.lockutils [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.960 185727 DEBUG nova.compute.manager [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] No waiting events found dispatching network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:46:35 compute-0 nova_compute[185723]: 2026-02-16 13:46:35.961 185727 WARNING nova.compute.manager [req-0bf46362-0a17-4b76-8bd0-33b2917d2166 req-060b84f2-131e-4953-ad38-2cf93dc781f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received unexpected event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 for instance with vm_state active and task_state None.
Feb 16 13:46:36 compute-0 nova_compute[185723]: 2026-02-16 13:46:36.983 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:40 compute-0 nova_compute[185723]: 2026-02-16 13:46:40.406 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:41 compute-0 podman[214664]: 2026-02-16 13:46:41.035862762 +0000 UTC m=+0.072541829 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:46:41 compute-0 nova_compute[185723]: 2026-02-16 13:46:41.986 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:45 compute-0 ovn_controller[96072]: 2026-02-16T13:46:45Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:1d:c5 10.100.0.13
Feb 16 13:46:45 compute-0 ovn_controller[96072]: 2026-02-16T13:46:45Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:1d:c5 10.100.0.13
Feb 16 13:46:45 compute-0 nova_compute[185723]: 2026-02-16 13:46:45.439 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:46 compute-0 nova_compute[185723]: 2026-02-16 13:46:46.988 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:48 compute-0 sshd-session[214707]: Invalid user ubuntu from 64.227.72.94 port 56640
Feb 16 13:46:48 compute-0 sshd-session[214707]: Connection closed by invalid user ubuntu 64.227.72.94 port 56640 [preauth]
Feb 16 13:46:50 compute-0 nova_compute[185723]: 2026-02-16 13:46:50.443 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:51 compute-0 podman[214709]: 2026-02-16 13:46:51.021696655 +0000 UTC m=+0.059678910 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:46:51 compute-0 nova_compute[185723]: 2026-02-16 13:46:51.991 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:53 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:46:55 compute-0 nova_compute[185723]: 2026-02-16 13:46:55.444 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:56 compute-0 nova_compute[185723]: 2026-02-16 13:46:56.994 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:59 compute-0 podman[195053]: time="2026-02-16T13:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:46:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:46:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 16 13:47:00 compute-0 nova_compute[185723]: 2026-02-16 13:47:00.446 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:01 compute-0 openstack_network_exporter[197909]: ERROR   13:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:47:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:47:01 compute-0 openstack_network_exporter[197909]: ERROR   13:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:47:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:47:01 compute-0 nova_compute[185723]: 2026-02-16 13:47:01.996 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:03.240 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:03.241 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:03.241 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:05 compute-0 nova_compute[185723]: 2026-02-16 13:47:05.448 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:05 compute-0 nova_compute[185723]: 2026-02-16 13:47:05.505 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating tmpfile /var/lib/nova/instances/tmpu_jf1msv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:47:05 compute-0 nova_compute[185723]: 2026-02-16 13:47:05.507 185727 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:47:06 compute-0 podman[214736]: 2026-02-16 13:47:06.038387077 +0000 UTC m=+0.076968648 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1770267347, vcs-type=git, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:47:06 compute-0 podman[214737]: 2026-02-16 13:47:06.038546411 +0000 UTC m=+0.074990109 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:47:07 compute-0 nova_compute[185723]: 2026-02-16 13:47:06.999 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:08 compute-0 nova_compute[185723]: 2026-02-16 13:47:08.574 185727 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:47:08 compute-0 nova_compute[185723]: 2026-02-16 13:47:08.601 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:47:08 compute-0 nova_compute[185723]: 2026-02-16 13:47:08.602 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquired lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:47:08 compute-0 nova_compute[185723]: 2026-02-16 13:47:08.602 185727 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:47:09 compute-0 sshd-session[214776]: Invalid user test from 146.190.226.24 port 40656
Feb 16 13:47:09 compute-0 sshd-session[214776]: Connection closed by invalid user test 146.190.226.24 port 40656 [preauth]
Feb 16 13:47:10 compute-0 nova_compute[185723]: 2026-02-16 13:47:10.450 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:12 compute-0 nova_compute[185723]: 2026-02-16 13:47:12.001 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:12 compute-0 podman[214778]: 2026-02-16 13:47:12.031222018 +0000 UTC m=+0.068166320 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.188 185727 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.434 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Releasing lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.436 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.437 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating instance directory: /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.437 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating disk.info with the contents: {'/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk': 'qcow2', '/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.438 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.438 185727 DEBUG nova.objects.instance [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.477 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.531 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.532 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.533 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.545 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.594 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.595 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.628 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.630 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.630 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.690 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.691 185727 DEBUG nova.virt.disk.api [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Checking if we can resize image /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.691 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.744 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.745 185727 DEBUG nova.virt.disk.api [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Cannot resize image /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.746 185727 DEBUG nova.objects.instance [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'migration_context' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.766 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.790 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.792 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config to /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:47:13 compute-0 nova_compute[185723]: 2026-02-16 13:47:13.793 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.157 185727 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6" returned: 0 in 0.364s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.158 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.160 185727 DEBUG nova.virt.libvirt.vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:46:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:46:55Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.160 185727 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.162 185727 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.162 185727 DEBUG os_vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.163 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.164 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.164 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.167 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.167 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ec2c49b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.168 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ec2c49b-40, col_values=(('external_ids', {'iface-id': '0ec2c49b-401e-4ba2-8344-3d943b18845b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:09:3f', 'vm-uuid': '3217baa5-9eb7-414f-b18a-c49217ace9b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:14 compute-0 NetworkManager[56177]: <info>  [1771249634.1726] manager: (tap0ec2c49b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.177 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.178 185727 INFO os_vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40')
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.179 185727 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:47:14 compute-0 nova_compute[185723]: 2026-02-16 13:47:14.179 185727 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:47:15 compute-0 nova_compute[185723]: 2026-02-16 13:47:15.451 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:15 compute-0 nova_compute[185723]: 2026-02-16 13:47:15.888 185727 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Port 0ec2c49b-401e-4ba2-8344-3d943b18845b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:47:15 compute-0 nova_compute[185723]: 2026-02-16 13:47:15.890 185727 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:47:16 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:47:16 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:47:16 compute-0 kernel: tap0ec2c49b-40: entered promiscuous mode
Feb 16 13:47:16 compute-0 NetworkManager[56177]: <info>  [1771249636.1868] manager: (tap0ec2c49b-40): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Feb 16 13:47:16 compute-0 ovn_controller[96072]: 2026-02-16T13:47:16Z|00187|binding|INFO|Claiming lport 0ec2c49b-401e-4ba2-8344-3d943b18845b for this additional chassis.
Feb 16 13:47:16 compute-0 nova_compute[185723]: 2026-02-16 13:47:16.187 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:16 compute-0 ovn_controller[96072]: 2026-02-16T13:47:16Z|00188|binding|INFO|0ec2c49b-401e-4ba2-8344-3d943b18845b: Claiming fa:16:3e:34:09:3f 10.100.0.4
Feb 16 13:47:16 compute-0 ovn_controller[96072]: 2026-02-16T13:47:16Z|00189|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b ovn-installed in OVS
Feb 16 13:47:16 compute-0 nova_compute[185723]: 2026-02-16 13:47:16.197 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:16 compute-0 nova_compute[185723]: 2026-02-16 13:47:16.199 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:16 compute-0 systemd-udevd[214863]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:47:16 compute-0 systemd-machined[155229]: New machine qemu-18-instance-00000016.
Feb 16 13:47:16 compute-0 NetworkManager[56177]: <info>  [1771249636.2249] device (tap0ec2c49b-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:47:16 compute-0 NetworkManager[56177]: <info>  [1771249636.2257] device (tap0ec2c49b-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:47:16 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000016.
Feb 16 13:47:16 compute-0 sshd-session[214827]: Invalid user postgres from 188.166.42.159 port 37782
Feb 16 13:47:16 compute-0 sshd-session[214827]: Connection closed by invalid user postgres 188.166.42.159 port 37782 [preauth]
Feb 16 13:47:17 compute-0 nova_compute[185723]: 2026-02-16 13:47:17.656 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249637.6562872, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:17 compute-0 nova_compute[185723]: 2026-02-16 13:47:17.658 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Started (Lifecycle Event)
Feb 16 13:47:17 compute-0 nova_compute[185723]: 2026-02-16 13:47:17.722 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:18 compute-0 nova_compute[185723]: 2026-02-16 13:47:18.356 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249638.3558397, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:18 compute-0 nova_compute[185723]: 2026-02-16 13:47:18.357 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Resumed (Lifecycle Event)
Feb 16 13:47:18 compute-0 nova_compute[185723]: 2026-02-16 13:47:18.400 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:18 compute-0 nova_compute[185723]: 2026-02-16 13:47:18.404 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:47:18 compute-0 nova_compute[185723]: 2026-02-16 13:47:18.425 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:47:19 compute-0 nova_compute[185723]: 2026-02-16 13:47:19.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-0 ovn_controller[96072]: 2026-02-16T13:47:20Z|00190|binding|INFO|Claiming lport 0ec2c49b-401e-4ba2-8344-3d943b18845b for this chassis.
Feb 16 13:47:20 compute-0 ovn_controller[96072]: 2026-02-16T13:47:20Z|00191|binding|INFO|0ec2c49b-401e-4ba2-8344-3d943b18845b: Claiming fa:16:3e:34:09:3f 10.100.0.4
Feb 16 13:47:20 compute-0 ovn_controller[96072]: 2026-02-16T13:47:20Z|00192|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b up in Southbound
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.372 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:09:3f 10.100.0.4'], port_security=['fa:16:3e:34:09:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3217baa5-9eb7-414f-b18a-c49217ace9b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=0ec2c49b-401e-4ba2-8344-3d943b18845b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.374 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 0ec2c49b-401e-4ba2-8344-3d943b18845b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.375 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.386 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b04b68a0-103d-4ba5-9052-8a35b985e08f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.410 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5e2915-a8d1-4335-9894-d8564265f423]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.414 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[bdab41e3-c83e-4711-b077-877ca44cd433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.435 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[afcc1f58-6916-4780-b490-71a1a4782d3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.451 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[18413b1d-c32e-4759-9f07-73231b918e6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567663, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214895, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.453 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.462 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[69e78c2b-4c80-4c6e-bd49-7405e8587f1e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567674, 'tstamp': 567674}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214896, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567677, 'tstamp': 567677}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214896, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.464 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.466 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.467 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.467 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.467 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:20 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:20.467 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.559 185727 INFO nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Post operation of migration started
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.998 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.999 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquired lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:47:20 compute-0 nova_compute[185723]: 2026-02-16 13:47:20.999 185727 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.461 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.462 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.462 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.463 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:47:21 compute-0 podman[214898]: 2026-02-16 13:47:21.571119838 +0000 UTC m=+0.058749607 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.579 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.636 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.637 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.686 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.692 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.742 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.743 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.786 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.947 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.948 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5548MB free_disk=73.16597747802734GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.949 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:21 compute-0 nova_compute[185723]: 2026-02-16 13:47:21.949 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.007 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Migration for instance 3217baa5-9eb7-414f-b18a-c49217ace9b6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.034 185727 INFO nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating resource usage from migration 96cc26af-369a-42a4-acdf-7e760a2b25f9
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.034 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Starting to track incoming migration 96cc26af-369a-42a4-acdf-7e760a2b25f9 with flavor 6d89f72c-1760-421e-a5f2-83dfc3723b84 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.111 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 5fc1ad70-adc5-4109-a323-39a1b7137888 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.138 185727 WARNING nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 3217baa5-9eb7-414f-b18a-c49217ace9b6 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.138 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.138 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.250 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.274 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.305 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:47:22 compute-0 nova_compute[185723]: 2026-02-16 13:47:22.305 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.932 185727 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.966 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Releasing lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.985 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.986 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.986 185727 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:23 compute-0 nova_compute[185723]: 2026-02-16 13:47:23.990 185727 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:47:23 compute-0 virtqemud[184843]: Domain id=18 name='instance-00000016' uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6 is tainted: custom-monitor
Feb 16 13:47:24 compute-0 nova_compute[185723]: 2026-02-16 13:47:24.174 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:24 compute-0 nova_compute[185723]: 2026-02-16 13:47:24.306 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:24 compute-0 nova_compute[185723]: 2026-02-16 13:47:24.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:24 compute-0 nova_compute[185723]: 2026-02-16 13:47:24.998 185727 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.454 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.901 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.901 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.901 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:47:25 compute-0 nova_compute[185723]: 2026-02-16 13:47:25.902 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5fc1ad70-adc5-4109-a323-39a1b7137888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:26 compute-0 nova_compute[185723]: 2026-02-16 13:47:26.003 185727 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:47:26 compute-0 nova_compute[185723]: 2026-02-16 13:47:26.009 185727 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:26 compute-0 nova_compute[185723]: 2026-02-16 13:47:26.028 185727 DEBUG nova.objects.instance [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.407 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updating instance_info_cache with network_info: [{"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.443 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-5fc1ad70-adc5-4109-a323-39a1b7137888" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.444 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.444 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.445 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.445 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.445 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:28 compute-0 nova_compute[185723]: 2026-02-16 13:47:28.445 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:47:29 compute-0 nova_compute[185723]: 2026-02-16 13:47:29.176 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:29 compute-0 nova_compute[185723]: 2026-02-16 13:47:29.440 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:29 compute-0 podman[195053]: time="2026-02-16T13:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:47:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:47:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 13:47:30 compute-0 nova_compute[185723]: 2026-02-16 13:47:30.455 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:31 compute-0 openstack_network_exporter[197909]: ERROR   13:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:47:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:47:31 compute-0 openstack_network_exporter[197909]: ERROR   13:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:47:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:47:31 compute-0 nova_compute[185723]: 2026-02-16 13:47:31.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:33 compute-0 sshd-session[214938]: Invalid user ubuntu from 64.227.72.94 port 52244
Feb 16 13:47:33 compute-0 sshd-session[214938]: Connection closed by invalid user ubuntu 64.227.72.94 port 52244 [preauth]
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.784 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.784 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.784 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.785 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.785 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.786 185727 INFO nova.compute.manager [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Terminating instance
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.787 185727 DEBUG nova.compute.manager [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:47:33 compute-0 kernel: tap0ec2c49b-40 (unregistering): left promiscuous mode
Feb 16 13:47:33 compute-0 NetworkManager[56177]: <info>  [1771249653.8144] device (tap0ec2c49b-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:47:33 compute-0 ovn_controller[96072]: 2026-02-16T13:47:33Z|00193|binding|INFO|Releasing lport 0ec2c49b-401e-4ba2-8344-3d943b18845b from this chassis (sb_readonly=0)
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.822 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-0 ovn_controller[96072]: 2026-02-16T13:47:33Z|00194|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b down in Southbound
Feb 16 13:47:33 compute-0 ovn_controller[96072]: 2026-02-16T13:47:33Z|00195|binding|INFO|Removing iface tap0ec2c49b-40 ovn-installed in OVS
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.824 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.827 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Deactivated successfully.
Feb 16 13:47:33 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Consumed 2.310s CPU time.
Feb 16 13:47:33 compute-0 systemd-machined[155229]: Machine qemu-18-instance-00000016 terminated.
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.871 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:09:3f 10.100.0.4'], port_security=['fa:16:3e:34:09:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3217baa5-9eb7-414f-b18a-c49217ace9b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=0ec2c49b-401e-4ba2-8344-3d943b18845b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.872 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 0ec2c49b-401e-4ba2-8344-3d943b18845b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.874 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.888 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0f72dd68-7fe6-432d-b984-7cb7bf747ba1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.912 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[bf425c53-83ef-4d6e-bfa7-afbcb305b62e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.915 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[800a2928-5cdc-4f84-8538-485a6f55db62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.934 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cea190-7ef4-499d-8435-cdabe8866b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.947 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ff874a8c-430d-46b7-8a74-4277eff41780]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567663, 'reachable_time': 34535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214951, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.959 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f25e9fe3-c5b8-452a-901d-957ae7ab8ea4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567674, 'tstamp': 567674}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214952, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567677, 'tstamp': 567677}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214952, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.961 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.962 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-0 nova_compute[185723]: 2026-02-16 13:47:33.966 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.967 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.967 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.968 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:33.968 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:47:33 compute-0 sshd-session[214936]: Invalid user guest from 146.190.22.227 port 35346
Feb 16 13:47:34 compute-0 kernel: tap0ec2c49b-40: entered promiscuous mode
Feb 16 13:47:34 compute-0 kernel: tap0ec2c49b-40 (unregistering): left promiscuous mode
Feb 16 13:47:34 compute-0 NetworkManager[56177]: <info>  [1771249654.0078] manager: (tap0ec2c49b-40): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.011 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.048 185727 INFO nova.virt.libvirt.driver [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance destroyed successfully.
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.048 185727 DEBUG nova.objects.instance [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.064 185727 DEBUG nova.virt.libvirt.vif [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:46:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:47:26Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.064 185727 DEBUG nova.network.os_vif_util [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.065 185727 DEBUG nova.network.os_vif_util [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.066 185727 DEBUG os_vif [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.068 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.069 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ec2c49b-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.070 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.072 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.074 185727 INFO os_vif [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40')
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.075 185727 INFO nova.virt.libvirt.driver [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Deleting instance files /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6_del
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.076 185727 INFO nova.virt.libvirt.driver [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Deletion of /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6_del complete
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.125 185727 INFO nova.compute.manager [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.126 185727 DEBUG oslo.service.loopingcall [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.126 185727 DEBUG nova.compute.manager [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:47:34 compute-0 nova_compute[185723]: 2026-02-16 13:47:34.126 185727 DEBUG nova.network.neutron [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:47:34 compute-0 sshd-session[214936]: Connection closed by invalid user guest 146.190.22.227 port 35346 [preauth]
Feb 16 13:47:35 compute-0 nova_compute[185723]: 2026-02-16 13:47:35.458 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:37 compute-0 podman[214965]: 2026-02-16 13:47:37.027938459 +0000 UTC m=+0.053511217 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:47:37 compute-0 podman[214966]: 2026-02-16 13:47:37.028100483 +0000 UTC m=+0.053089457 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.561 185727 DEBUG nova.compute.manager [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.562 185727 DEBUG oslo_concurrency.lockutils [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.562 185727 DEBUG oslo_concurrency.lockutils [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.562 185727 DEBUG oslo_concurrency.lockutils [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.562 185727 DEBUG nova.compute.manager [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.562 185727 DEBUG nova.compute.manager [req-cb28ca10-b511-44aa-813b-337c46c3eaf7 req-baba7956-f8cf-4252-bccd-a1a7d26004ee faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:47:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:37.773 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.774 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:37.774 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.796 185727 DEBUG nova.network.neutron [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.819 185727 INFO nova.compute.manager [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Took 3.69 seconds to deallocate network for instance.
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.876 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.877 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.881 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.889 185727 DEBUG nova.compute.manager [req-a47f72cd-e80c-4abc-ab77-269d38ece0be req-c1dc3cb2-5439-4ff4-b776-c7a18d29fb7a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-deleted-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:37 compute-0 nova_compute[185723]: 2026-02-16 13:47:37.923 185727 INFO nova.scheduler.client.report [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 3217baa5-9eb7-414f-b18a-c49217ace9b6
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.016 185727 DEBUG oslo_concurrency.lockutils [None req-fde0be4e-e1b1-4629-902f-a4f1d1349c6f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.871 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.872 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.872 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.872 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.872 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.874 185727 INFO nova.compute.manager [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Terminating instance
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.875 185727 DEBUG nova.compute.manager [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:47:38 compute-0 kernel: tap303c8798-0b (unregistering): left promiscuous mode
Feb 16 13:47:38 compute-0 NetworkManager[56177]: <info>  [1771249658.9192] device (tap303c8798-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.919 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.925 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:38 compute-0 ovn_controller[96072]: 2026-02-16T13:47:38Z|00196|binding|INFO|Releasing lport 303c8798-0b7a-4dc2-ac3d-fd237012b497 from this chassis (sb_readonly=0)
Feb 16 13:47:38 compute-0 ovn_controller[96072]: 2026-02-16T13:47:38Z|00197|binding|INFO|Setting lport 303c8798-0b7a-4dc2-ac3d-fd237012b497 down in Southbound
Feb 16 13:47:38 compute-0 ovn_controller[96072]: 2026-02-16T13:47:38Z|00198|binding|INFO|Removing iface tap303c8798-0b ovn-installed in OVS
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.927 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:38 compute-0 nova_compute[185723]: 2026-02-16 13:47:38.930 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:38.934 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:1d:c5 10.100.0.13'], port_security=['fa:16:3e:71:1d:c5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5fc1ad70-adc5-4109-a323-39a1b7137888', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=303c8798-0b7a-4dc2-ac3d-fd237012b497) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:38.936 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 303c8798-0b7a-4dc2-ac3d-fd237012b497 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:47:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:38.937 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:47:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:38.939 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b8dc38-cef8-48aa-8bb7-0dd47b0eaab1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:38.940 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:47:38 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000015.scope: Deactivated successfully.
Feb 16 13:47:38 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000015.scope: Consumed 13.211s CPU time.
Feb 16 13:47:38 compute-0 systemd-machined[155229]: Machine qemu-17-instance-00000015 terminated.
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.071 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.094 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.097 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [NOTICE]   (214612) : haproxy version is 2.8.14-c23fe91
Feb 16 13:47:39 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [NOTICE]   (214612) : path to executable is /usr/sbin/haproxy
Feb 16 13:47:39 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [WARNING]  (214612) : Exiting Master process...
Feb 16 13:47:39 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [ALERT]    (214612) : Current worker (214614) exited with code 143 (Terminated)
Feb 16 13:47:39 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214608]: [WARNING]  (214612) : All workers exited. Exiting... (0)
Feb 16 13:47:39 compute-0 systemd[1]: libpod-9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73.scope: Deactivated successfully.
Feb 16 13:47:39 compute-0 podman[215027]: 2026-02-16 13:47:39.132444802 +0000 UTC m=+0.119738949 container died 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.133 185727 INFO nova.virt.libvirt.driver [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Instance destroyed successfully.
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.134 185727 DEBUG nova.objects.instance [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 5fc1ad70-adc5-4109-a323-39a1b7137888 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.152 185727 DEBUG nova.virt.libvirt.vif [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:46:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1875661694',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1875661694',id=21,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:46:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-kx5zzt03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:46:34Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5fc1ad70-adc5-4109-a323-39a1b7137888,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.152 185727 DEBUG nova.network.os_vif_util [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "address": "fa:16:3e:71:1d:c5", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap303c8798-0b", "ovs_interfaceid": "303c8798-0b7a-4dc2-ac3d-fd237012b497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.153 185727 DEBUG nova.network.os_vif_util [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.153 185727 DEBUG os_vif [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.154 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.155 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap303c8798-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.158 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.161 185727 INFO os_vif [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:1d:c5,bridge_name='br-int',has_traffic_filtering=True,id=303c8798-0b7a-4dc2-ac3d-fd237012b497,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap303c8798-0b')
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.161 185727 INFO nova.virt.libvirt.driver [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Deleting instance files /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888_del
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.161 185727 INFO nova.virt.libvirt.driver [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Deletion of /var/lib/nova/instances/5fc1ad70-adc5-4109-a323-39a1b7137888_del complete
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.235 185727 INFO nova.compute.manager [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.236 185727 DEBUG oslo.service.loopingcall [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.236 185727 DEBUG nova.compute.manager [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.236 185727 DEBUG nova.network.neutron [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:47:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73-userdata-shm.mount: Deactivated successfully.
Feb 16 13:47:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-81fd336443fdc13027760b7f85d6676cd0044ff839f0a3c63822e13fbab7877a-merged.mount: Deactivated successfully.
Feb 16 13:47:39 compute-0 podman[215027]: 2026-02-16 13:47:39.656200893 +0000 UTC m=+0.643495030 container cleanup 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:47:39 compute-0 systemd[1]: libpod-conmon-9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73.scope: Deactivated successfully.
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.782 185727 DEBUG nova.compute.manager [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.783 185727 DEBUG oslo_concurrency.lockutils [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.784 185727 DEBUG oslo_concurrency.lockutils [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.784 185727 DEBUG oslo_concurrency.lockutils [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.784 185727 DEBUG nova.compute.manager [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.785 185727 WARNING nova.compute.manager [req-526ebe7e-3184-47ab-a81c-9f38a2138edd req-457e64bf-30ca-4cc1-9089-3f169a9d5459 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state deleted and task_state None.
Feb 16 13:47:39 compute-0 podman[215072]: 2026-02-16 13:47:39.879897719 +0000 UTC m=+0.203903617 container remove 9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.884 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3eef2a-ae91-4aba-9830-b91cdab79271]: (4, ('Mon Feb 16 01:47:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73)\n9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73\nMon Feb 16 01:47:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73)\n9089f89607e35f29a02afc6a40f7f1671ba42a3d00bc43c5b50256e53e01cc73\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.886 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[aede0491-ae4b-4a64-9dd2-38db69d636f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.887 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.889 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:47:39 compute-0 nova_compute[185723]: 2026-02-16 13:47:39.893 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.896 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5a862360-5cf5-4f71-bcc2-9fb43065b1f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.911 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[644be11e-7464-4238-b08c-dc46ed055694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.914 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7126b1f4-81e5-40c7-8be2-00cdfac90295]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.929 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[28f3f2d8-ad6f-476f-b83d-83fd9d4a757a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567657, 'reachable_time': 27025, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215087, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.932 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:47:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:39.932 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e784fa0c-5566-438e-afde-21f6c6676cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.028 185727 DEBUG nova.compute.manager [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-unplugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.028 185727 DEBUG oslo_concurrency.lockutils [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.028 185727 DEBUG oslo_concurrency.lockutils [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.029 185727 DEBUG oslo_concurrency.lockutils [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.029 185727 DEBUG nova.compute.manager [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] No waiting events found dispatching network-vif-unplugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.029 185727 DEBUG nova.compute.manager [req-6420db99-59e7-4d7d-8302-8ad60bf451b8 req-a80c58a5-f6d4-4bc7-9c87-72fe6cff7bb7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-unplugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:47:40 compute-0 nova_compute[185723]: 2026-02-16 13:47:40.505 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.155 185727 DEBUG nova.compute.manager [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.155 185727 DEBUG oslo_concurrency.lockutils [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.156 185727 DEBUG oslo_concurrency.lockutils [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.156 185727 DEBUG oslo_concurrency.lockutils [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.156 185727 DEBUG nova.compute.manager [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] No waiting events found dispatching network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.157 185727 WARNING nova.compute.manager [req-70dad813-897d-4ef7-aecb-91fcd0f59a76 req-e52f34c3-006c-480f-bb33-bf5e0d017e53 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received unexpected event network-vif-plugged-303c8798-0b7a-4dc2-ac3d-fd237012b497 for instance with vm_state active and task_state deleting.
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.936 185727 DEBUG nova.network.neutron [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.956 185727 INFO nova.compute.manager [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Took 3.72 seconds to deallocate network for instance.
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.996 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:42 compute-0 nova_compute[185723]: 2026-02-16 13:47:42.997 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:43 compute-0 nova_compute[185723]: 2026-02-16 13:47:43.046 185727 DEBUG nova.compute.provider_tree [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:47:43 compute-0 podman[215088]: 2026-02-16 13:47:43.053809109 +0000 UTC m=+0.085703206 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:47:43 compute-0 nova_compute[185723]: 2026-02-16 13:47:43.061 185727 DEBUG nova.scheduler.client.report [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:47:43 compute-0 nova_compute[185723]: 2026-02-16 13:47:43.080 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:43 compute-0 nova_compute[185723]: 2026-02-16 13:47:43.102 185727 INFO nova.scheduler.client.report [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 5fc1ad70-adc5-4109-a323-39a1b7137888
Feb 16 13:47:43 compute-0 nova_compute[185723]: 2026-02-16 13:47:43.197 185727 DEBUG oslo_concurrency.lockutils [None req-ba3e6f97-87db-4ea3-ad32-4d6e2eba888a e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5fc1ad70-adc5-4109-a323-39a1b7137888" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:44 compute-0 nova_compute[185723]: 2026-02-16 13:47:44.158 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:44 compute-0 nova_compute[185723]: 2026-02-16 13:47:44.303 185727 DEBUG nova.compute.manager [req-e328838a-dd9e-411f-b0b0-da6eff5d5b9c req-1a2cc627-5da1-4e26-a89a-675480d1b9c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Received event network-vif-deleted-303c8798-0b7a-4dc2-ac3d-fd237012b497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:45 compute-0 nova_compute[185723]: 2026-02-16 13:47:45.506 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:45 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:47:45.776 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:49 compute-0 nova_compute[185723]: 2026-02-16 13:47:49.045 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249654.044111, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:49 compute-0 nova_compute[185723]: 2026-02-16 13:47:49.046 185727 INFO nova.compute.manager [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Stopped (Lifecycle Event)
Feb 16 13:47:49 compute-0 nova_compute[185723]: 2026-02-16 13:47:49.081 185727 DEBUG nova.compute.manager [None req-d4169665-3306-4d0a-8dbf-d048db5b9e9a - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:49 compute-0 nova_compute[185723]: 2026-02-16 13:47:49.161 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:50 compute-0 nova_compute[185723]: 2026-02-16 13:47:50.508 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:52 compute-0 podman[215114]: 2026-02-16 13:47:52.009154061 +0000 UTC m=+0.046069323 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:47:54 compute-0 nova_compute[185723]: 2026-02-16 13:47:54.134 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249659.1313198, 5fc1ad70-adc5-4109-a323-39a1b7137888 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:54 compute-0 nova_compute[185723]: 2026-02-16 13:47:54.134 185727 INFO nova.compute.manager [-] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] VM Stopped (Lifecycle Event)
Feb 16 13:47:54 compute-0 nova_compute[185723]: 2026-02-16 13:47:54.158 185727 DEBUG nova.compute.manager [None req-8ddcd271-4c88-4c4e-bd45-0b227f29ff67 - - - - - -] [instance: 5fc1ad70-adc5-4109-a323-39a1b7137888] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:54 compute-0 nova_compute[185723]: 2026-02-16 13:47:54.163 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:55 compute-0 nova_compute[185723]: 2026-02-16 13:47:55.510 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:59 compute-0 nova_compute[185723]: 2026-02-16 13:47:59.208 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:59 compute-0 podman[195053]: time="2026-02-16T13:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:47:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:47:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:48:00 compute-0 nova_compute[185723]: 2026-02-16 13:48:00.511 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:01 compute-0 openstack_network_exporter[197909]: ERROR   13:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:48:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:48:01 compute-0 openstack_network_exporter[197909]: ERROR   13:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:48:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:48:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:03.241 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:03.242 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:03.242 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:04 compute-0 nova_compute[185723]: 2026-02-16 13:48:04.210 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:05 compute-0 nova_compute[185723]: 2026-02-16 13:48:05.513 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:08 compute-0 podman[215139]: 2026-02-16 13:48:08.012263152 +0000 UTC m=+0.047102119 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:48:08 compute-0 podman[215138]: 2026-02-16 13:48:08.036121673 +0000 UTC m=+0.075012281 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:48:09 compute-0 nova_compute[185723]: 2026-02-16 13:48:09.213 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:09 compute-0 sshd-session[215178]: Invalid user postgres from 188.166.42.159 port 41492
Feb 16 13:48:09 compute-0 sshd-session[215178]: Connection closed by invalid user postgres 188.166.42.159 port 41492 [preauth]
Feb 16 13:48:10 compute-0 nova_compute[185723]: 2026-02-16 13:48:10.517 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:12 compute-0 ovn_controller[96072]: 2026-02-16T13:48:12Z|00199|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Feb 16 13:48:13 compute-0 nova_compute[185723]: 2026-02-16 13:48:13.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:13 compute-0 nova_compute[185723]: 2026-02-16 13:48:13.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:48:13 compute-0 nova_compute[185723]: 2026-02-16 13:48:13.452 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:48:14 compute-0 podman[215180]: 2026-02-16 13:48:14.082098213 +0000 UTC m=+0.122338944 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 16 13:48:14 compute-0 nova_compute[185723]: 2026-02-16 13:48:14.215 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:15 compute-0 nova_compute[185723]: 2026-02-16 13:48:15.518 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:18 compute-0 sshd-session[215208]: Invalid user ubuntu from 64.227.72.94 port 54380
Feb 16 13:48:18 compute-0 sshd-session[215208]: Connection closed by invalid user ubuntu 64.227.72.94 port 54380 [preauth]
Feb 16 13:48:19 compute-0 sshd-session[215207]: Invalid user test from 146.190.226.24 port 37256
Feb 16 13:48:19 compute-0 nova_compute[185723]: 2026-02-16 13:48:19.216 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:19 compute-0 sshd-session[215207]: Connection closed by invalid user test 146.190.226.24 port 37256 [preauth]
Feb 16 13:48:20 compute-0 nova_compute[185723]: 2026-02-16 13:48:20.551 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.453 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.507 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.507 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.508 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.508 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:48:22 compute-0 podman[215211]: 2026-02-16 13:48:22.593376607 +0000 UTC m=+0.048706998 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.672 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.673 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.22404098510742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.673 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.673 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.811 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.811 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.890 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.914 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.940 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:48:22 compute-0 nova_compute[185723]: 2026-02-16 13:48:22.940 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:24 compute-0 nova_compute[185723]: 2026-02-16 13:48:24.219 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:24 compute-0 nova_compute[185723]: 2026-02-16 13:48:24.921 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:25 compute-0 nova_compute[185723]: 2026-02-16 13:48:25.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:25 compute-0 nova_compute[185723]: 2026-02-16 13:48:25.552 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:26 compute-0 nova_compute[185723]: 2026-02-16 13:48:26.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:26 compute-0 nova_compute[185723]: 2026-02-16 13:48:26.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:48:26 compute-0 nova_compute[185723]: 2026-02-16 13:48:26.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:48:26 compute-0 nova_compute[185723]: 2026-02-16 13:48:26.453 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:48:26 compute-0 nova_compute[185723]: 2026-02-16 13:48:26.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:27 compute-0 nova_compute[185723]: 2026-02-16 13:48:27.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:28 compute-0 nova_compute[185723]: 2026-02-16 13:48:28.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:28 compute-0 nova_compute[185723]: 2026-02-16 13:48:28.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:29 compute-0 nova_compute[185723]: 2026-02-16 13:48:29.220 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-0 nova_compute[185723]: 2026-02-16 13:48:29.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:29 compute-0 nova_compute[185723]: 2026-02-16 13:48:29.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:48:29 compute-0 podman[195053]: time="2026-02-16T13:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:48:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:48:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:48:30 compute-0 nova_compute[185723]: 2026-02-16 13:48:30.553 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:31 compute-0 openstack_network_exporter[197909]: ERROR   13:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:48:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:48:31 compute-0 openstack_network_exporter[197909]: ERROR   13:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:48:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:48:31 compute-0 nova_compute[185723]: 2026-02-16 13:48:31.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:31 compute-0 nova_compute[185723]: 2026-02-16 13:48:31.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:48:34 compute-0 nova_compute[185723]: 2026-02-16 13:48:34.222 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.392 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.392 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.420 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.521 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.521 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.528 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.529 185727 INFO nova.compute.claims [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.554 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.651 185727 DEBUG nova.compute.provider_tree [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.668 185727 DEBUG nova.scheduler.client.report [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.694 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.695 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.767 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.767 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.792 185727 INFO nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.817 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.941 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.943 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.943 185727 INFO nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Creating image(s)
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.944 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.944 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.945 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.963 185727 DEBUG nova.policy [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:48:35 compute-0 nova_compute[185723]: 2026-02-16 13:48:35.966 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.015 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.016 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.017 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.027 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.074 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.075 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.104 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.105 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.105 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.156 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.158 185727 DEBUG nova.virt.disk.api [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.158 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.210 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.211 185727 DEBUG nova.virt.disk.api [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.211 185727 DEBUG nova.objects.instance [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 18d7e5d6-d36a-46d7-b461-264c28cb9043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.226 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.227 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Ensure instance console log exists: /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.227 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.228 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:36 compute-0 nova_compute[185723]: 2026-02-16 13:48:36.228 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:37 compute-0 nova_compute[185723]: 2026-02-16 13:48:37.597 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Successfully created port: 321f3fac-0060-4083-a357-cce4f142588b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:48:38 compute-0 nova_compute[185723]: 2026-02-16 13:48:38.217 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:38.217 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:48:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:38.218 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:48:38 compute-0 podman[215254]: 2026-02-16 13:48:38.999185551 +0000 UTC m=+0.040528925 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:48:39 compute-0 podman[215253]: 2026-02-16 13:48:39.009069066 +0000 UTC m=+0.052533673 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.7)
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.157 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Successfully updated port: 321f3fac-0060-4083-a357-cce4f142588b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.173 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.173 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.173 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.225 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.253 185727 DEBUG nova.compute.manager [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-changed-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.254 185727 DEBUG nova.compute.manager [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Refreshing instance network info cache due to event network-changed-321f3fac-0060-4083-a357-cce4f142588b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.254 185727 DEBUG oslo_concurrency.lockutils [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:48:39 compute-0 nova_compute[185723]: 2026-02-16 13:48:39.454 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.123 185727 DEBUG nova.network.neutron [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updating instance_info_cache with network_info: [{"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.141 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.142 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Instance network_info: |[{"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.143 185727 DEBUG oslo_concurrency.lockutils [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.143 185727 DEBUG nova.network.neutron [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Refreshing network info cache for port 321f3fac-0060-4083-a357-cce4f142588b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.147 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Start _get_guest_xml network_info=[{"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.152 185727 WARNING nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.157 185727 DEBUG nova.virt.libvirt.host [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.158 185727 DEBUG nova.virt.libvirt.host [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.167 185727 DEBUG nova.virt.libvirt.host [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.168 185727 DEBUG nova.virt.libvirt.host [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.169 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.170 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.170 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.170 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.171 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.171 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.171 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.171 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.172 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.172 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.172 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.172 185727 DEBUG nova.virt.hardware [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.176 185727 DEBUG nova.virt.libvirt.vif [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:48:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1866584778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1866584778',id=24,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-lgc90ub5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:48:35Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=18d7e5d6-d36a-46d7-b461-264c28cb9043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.177 185727 DEBUG nova.network.os_vif_util [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.177 185727 DEBUG nova.network.os_vif_util [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.179 185727 DEBUG nova.objects.instance [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 18d7e5d6-d36a-46d7-b461-264c28cb9043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.193 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <uuid>18d7e5d6-d36a-46d7-b461-264c28cb9043</uuid>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <name>instance-00000018</name>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteStrategies-server-1866584778</nova:name>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:48:40</nova:creationTime>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         <nova:port uuid="321f3fac-0060-4083-a357-cce4f142588b">
Feb 16 13:48:40 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <system>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="serial">18d7e5d6-d36a-46d7-b461-264c28cb9043</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="uuid">18d7e5d6-d36a-46d7-b461-264c28cb9043</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </system>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <os>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </os>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <features>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </features>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.config"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:5e:12:9a"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <target dev="tap321f3fac-00"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/console.log" append="off"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <video>
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </video>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:48:40 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:48:40 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:48:40 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:48:40 compute-0 nova_compute[185723]: </domain>
Feb 16 13:48:40 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.194 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Preparing to wait for external event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.194 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.195 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.195 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.196 185727 DEBUG nova.virt.libvirt.vif [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:48:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1866584778',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1866584778',id=24,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-lgc90ub5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:48:35Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=18d7e5d6-d36a-46d7-b461-264c28cb9043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.196 185727 DEBUG nova.network.os_vif_util [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.197 185727 DEBUG nova.network.os_vif_util [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.197 185727 DEBUG os_vif [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.197 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.198 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.198 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.201 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.201 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap321f3fac-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.202 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap321f3fac-00, col_values=(('external_ids', {'iface-id': '321f3fac-0060-4083-a357-cce4f142588b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:12:9a', 'vm-uuid': '18d7e5d6-d36a-46d7-b461-264c28cb9043'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.203 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.204 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:40 compute-0 NetworkManager[56177]: <info>  [1771249720.2059] manager: (tap321f3fac-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.206 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.211 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.212 185727 INFO os_vif [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00')
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.270 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.271 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.271 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:5e:12:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.271 185727 INFO nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Using config drive
Feb 16 13:48:40 compute-0 nova_compute[185723]: 2026-02-16 13:48:40.557 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.214 185727 INFO nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Creating config drive at /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.config
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.218 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxacwv3oz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.338 185727 DEBUG oslo_concurrency.processutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpxacwv3oz" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:41 compute-0 kernel: tap321f3fac-00: entered promiscuous mode
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.3947] manager: (tap321f3fac-00): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Feb 16 13:48:41 compute-0 ovn_controller[96072]: 2026-02-16T13:48:41Z|00200|binding|INFO|Claiming lport 321f3fac-0060-4083-a357-cce4f142588b for this chassis.
Feb 16 13:48:41 compute-0 ovn_controller[96072]: 2026-02-16T13:48:41Z|00201|binding|INFO|321f3fac-0060-4083-a357-cce4f142588b: Claiming fa:16:3e:5e:12:9a 10.100.0.5
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.396 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 ovn_controller[96072]: 2026-02-16T13:48:41Z|00202|binding|INFO|Setting lport 321f3fac-0060-4083-a357-cce4f142588b ovn-installed in OVS
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.402 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 ovn_controller[96072]: 2026-02-16T13:48:41Z|00203|binding|INFO|Setting lport 321f3fac-0060-4083-a357-cce4f142588b up in Southbound
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.404 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.405 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:12:9a 10.100.0.5'], port_security=['fa:16:3e:5e:12:9a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '18d7e5d6-d36a-46d7-b461-264c28cb9043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=321f3fac-0060-4083-a357-cce4f142588b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.407 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 321f3fac-0060-4083-a357-cce4f142588b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.409 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.414 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.419 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cff00591-97b0-4f95-9f3c-d95eb52e52b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.419 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.421 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.421 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[378ee48e-a5da-4bbf-9b5a-67a4d14e9bbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.422 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3b768ba8-487f-4276-8801-ab29a5f5d19c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 systemd-udevd[215314]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:48:41 compute-0 systemd-machined[155229]: New machine qemu-19-instance-00000018.
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.4332] device (tap321f3fac-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.4340] device (tap321f3fac-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.435 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[101ec7f7-f20a-42d6-a404-5292370a7550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.448 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[478255d7-6a03-4f42-afe6-800d206b0903]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000018.
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.469 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[331b119a-4d72-41ed-bb77-afe340e1f1a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.474 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9bea73a6-585d-447b-ba22-eb69f2cc2160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.4750] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Feb 16 13:48:41 compute-0 systemd-udevd[215318]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.499 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a88405-2c24-4f02-933d-f4b0cbc02bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.503 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[2cabf5b7-ae5d-4c23-a966-b8078ebac02d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.5213] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.526 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9b1775-509a-414f-823c-1e8a095536e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.540 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1a152110-b805-44e0-a0d4-1e0d43ab12f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580491, 'reachable_time': 24379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215347, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.550 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[be0d2b3e-2d40-44e3-89b7-4b392880909e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580491, 'tstamp': 580491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215348, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.560 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[eab6e78a-ee1a-4030-87cc-bef11a65aea6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580491, 'reachable_time': 24379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215349, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.583 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[67320c3d-b8b4-4366-ba21-363136d187fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.639 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[34a0946c-1311-46aa-9c75-aca0f80a7c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.641 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.641 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.642 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:41 compute-0 NetworkManager[56177]: <info>  [1771249721.6446] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.645 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.651 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:41 compute-0 ovn_controller[96072]: 2026-02-16T13:48:41Z|00204|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.653 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 nova_compute[185723]: 2026-02-16 13:48:41.656 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.659 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.660 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[79d5a1c6-f9fe-4655-9d55-2376139d1eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.660 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:48:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:41.661 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:48:41 compute-0 podman[215381]: 2026-02-16 13:48:41.96565699 +0000 UTC m=+0.042567386 container create ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 16 13:48:42 compute-0 systemd[1]: Started libpod-conmon-ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e.scope.
Feb 16 13:48:42 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:48:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/162a069f8c85a53382bea5d11160c4d165fee1f65a30aa36713f4a016e78157d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:48:42 compute-0 podman[215381]: 2026-02-16 13:48:41.943655295 +0000 UTC m=+0.020565721 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:48:42 compute-0 podman[215381]: 2026-02-16 13:48:42.040587547 +0000 UTC m=+0.117497973 container init ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 16 13:48:42 compute-0 podman[215381]: 2026-02-16 13:48:42.046267968 +0000 UTC m=+0.123178374 container start ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 16 13:48:42 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [NOTICE]   (215406) : New worker (215409) forked
Feb 16 13:48:42 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [NOTICE]   (215406) : Loading success.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.113 185727 DEBUG nova.compute.manager [req-d28ca417-dfa6-4eb9-a93d-cf023a53af8f req-1f7b0f24-f4c5-420f-a501-570a343cb3a2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.113 185727 DEBUG oslo_concurrency.lockutils [req-d28ca417-dfa6-4eb9-a93d-cf023a53af8f req-1f7b0f24-f4c5-420f-a501-570a343cb3a2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.114 185727 DEBUG oslo_concurrency.lockutils [req-d28ca417-dfa6-4eb9-a93d-cf023a53af8f req-1f7b0f24-f4c5-420f-a501-570a343cb3a2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.114 185727 DEBUG oslo_concurrency.lockutils [req-d28ca417-dfa6-4eb9-a93d-cf023a53af8f req-1f7b0f24-f4c5-420f-a501-570a343cb3a2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.114 185727 DEBUG nova.compute.manager [req-d28ca417-dfa6-4eb9-a93d-cf023a53af8f req-1f7b0f24-f4c5-420f-a501-570a343cb3a2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Processing event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.123 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.124 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249722.1231163, 18d7e5d6-d36a-46d7-b461-264c28cb9043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.124 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] VM Started (Lifecycle Event)
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.127 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.130 185727 INFO nova.virt.libvirt.driver [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Instance spawned successfully.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.131 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.157 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.164 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.167 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.167 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.168 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.168 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.169 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.170 185727 DEBUG nova.virt.libvirt.driver [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.223 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.224 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249722.1233027, 18d7e5d6-d36a-46d7-b461-264c28cb9043 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.224 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] VM Paused (Lifecycle Event)
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.263 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.267 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249722.1268487, 18d7e5d6-d36a-46d7-b461-264c28cb9043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.268 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] VM Resumed (Lifecycle Event)
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.284 185727 DEBUG nova.network.neutron [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updated VIF entry in instance network info cache for port 321f3fac-0060-4083-a357-cce4f142588b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.285 185727 DEBUG nova.network.neutron [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updating instance_info_cache with network_info: [{"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.300 185727 INFO nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Took 6.36 seconds to spawn the instance on the hypervisor.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.301 185727 DEBUG nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.332 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.335 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.338 185727 DEBUG oslo_concurrency.lockutils [req-f9375a17-3225-43ab-b335-4e6631c77311 req-002f8627-74f1-4564-8047-07ff07c04900 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.377 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.386 185727 INFO nova.compute.manager [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Took 6.89 seconds to build instance.
Feb 16 13:48:42 compute-0 nova_compute[185723]: 2026-02-16 13:48:42.402 185727 DEBUG oslo_concurrency.lockutils [None req-708decea-b236-4591-adce-55d911640ae7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.217 185727 DEBUG nova.compute.manager [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.218 185727 DEBUG oslo_concurrency.lockutils [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.219 185727 DEBUG oslo_concurrency.lockutils [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.219 185727 DEBUG oslo_concurrency.lockutils [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.219 185727 DEBUG nova.compute.manager [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] No waiting events found dispatching network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:48:44 compute-0 nova_compute[185723]: 2026-02-16 13:48:44.219 185727 WARNING nova.compute.manager [req-6e759573-ffbe-4e40-8ef1-d36f8ed6ca66 req-eda4ab7b-5a1f-4eb0-bbe5-8616024d6822 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received unexpected event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b for instance with vm_state active and task_state None.
Feb 16 13:48:45 compute-0 podman[215419]: 2026-02-16 13:48:45.035105151 +0000 UTC m=+0.073960754 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:48:45 compute-0 nova_compute[185723]: 2026-02-16 13:48:45.203 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:45 compute-0 nova_compute[185723]: 2026-02-16 13:48:45.559 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:48 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:48:48.220 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:50 compute-0 nova_compute[185723]: 2026-02-16 13:48:50.207 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:50 compute-0 nova_compute[185723]: 2026-02-16 13:48:50.599 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:53 compute-0 podman[215464]: 2026-02-16 13:48:53.008095214 +0000 UTC m=+0.046238927 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:48:54 compute-0 nova_compute[185723]: 2026-02-16 13:48:54.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:55 compute-0 ovn_controller[96072]: 2026-02-16T13:48:55Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:12:9a 10.100.0.5
Feb 16 13:48:55 compute-0 ovn_controller[96072]: 2026-02-16T13:48:55Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:12:9a 10.100.0.5
Feb 16 13:48:55 compute-0 nova_compute[185723]: 2026-02-16 13:48:55.254 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:55 compute-0 nova_compute[185723]: 2026-02-16 13:48:55.601 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:59 compute-0 podman[195053]: time="2026-02-16T13:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:48:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:48:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:49:00 compute-0 nova_compute[185723]: 2026-02-16 13:49:00.257 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:00 compute-0 nova_compute[185723]: 2026-02-16 13:49:00.604 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:01 compute-0 openstack_network_exporter[197909]: ERROR   13:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:49:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:49:01 compute-0 openstack_network_exporter[197909]: ERROR   13:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:49:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:49:01 compute-0 sshd-session[215488]: Invalid user postgres from 188.166.42.159 port 40820
Feb 16 13:49:01 compute-0 sshd-session[215488]: Connection closed by invalid user postgres 188.166.42.159 port 40820 [preauth]
Feb 16 13:49:02 compute-0 sshd-session[215490]: Invalid user ubuntu from 64.227.72.94 port 51702
Feb 16 13:49:02 compute-0 sshd-session[215490]: Connection closed by invalid user ubuntu 64.227.72.94 port 51702 [preauth]
Feb 16 13:49:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:03.243 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:03.245 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:03.246 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:05 compute-0 nova_compute[185723]: 2026-02-16 13:49:05.258 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:05 compute-0 nova_compute[185723]: 2026-02-16 13:49:05.605 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:10 compute-0 podman[215493]: 2026-02-16 13:49:10.009160792 +0000 UTC m=+0.046434252 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:49:10 compute-0 podman[215492]: 2026-02-16 13:49:10.017268423 +0000 UTC m=+0.055971308 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1770267347, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Feb 16 13:49:10 compute-0 nova_compute[185723]: 2026-02-16 13:49:10.262 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:10 compute-0 nova_compute[185723]: 2026-02-16 13:49:10.608 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:11 compute-0 ovn_controller[96072]: 2026-02-16T13:49:11Z|00205|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 16 13:49:13 compute-0 sshd-session[215530]: Invalid user weblogic from 146.190.22.227 port 53230
Feb 16 13:49:13 compute-0 sshd-session[215530]: Connection closed by invalid user weblogic 146.190.22.227 port 53230 [preauth]
Feb 16 13:49:15 compute-0 nova_compute[185723]: 2026-02-16 13:49:15.265 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:15 compute-0 nova_compute[185723]: 2026-02-16 13:49:15.610 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:16 compute-0 podman[215533]: 2026-02-16 13:49:16.060311258 +0000 UTC m=+0.105359139 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 13:49:18 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:49:20 compute-0 nova_compute[185723]: 2026-02-16 13:49:20.269 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:20 compute-0 nova_compute[185723]: 2026-02-16 13:49:20.613 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.453 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.480 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.480 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.481 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.481 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.564 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.620 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.622 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.676 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.807 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.808 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5688MB free_disk=73.19513320922852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.809 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.809 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.933 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 18d7e5d6-d36a-46d7-b461-264c28cb9043 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.934 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:49:22 compute-0 nova_compute[185723]: 2026-02-16 13:49:22.934 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:49:23 compute-0 nova_compute[185723]: 2026-02-16 13:49:23.077 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:49:23 compute-0 nova_compute[185723]: 2026-02-16 13:49:23.096 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:49:23 compute-0 nova_compute[185723]: 2026-02-16 13:49:23.117 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:49:23 compute-0 nova_compute[185723]: 2026-02-16 13:49:23.117 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:23 compute-0 podman[215569]: 2026-02-16 13:49:23.998429543 +0000 UTC m=+0.041893622 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:49:25 compute-0 nova_compute[185723]: 2026-02-16 13:49:25.272 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:25 compute-0 nova_compute[185723]: 2026-02-16 13:49:25.615 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:26 compute-0 nova_compute[185723]: 2026-02-16 13:49:26.097 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:26 compute-0 nova_compute[185723]: 2026-02-16 13:49:26.098 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:26 compute-0 nova_compute[185723]: 2026-02-16 13:49:26.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.435 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.691 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.692 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.692 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:49:27 compute-0 nova_compute[185723]: 2026-02-16 13:49:27.692 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 18d7e5d6-d36a-46d7-b461-264c28cb9043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:27 compute-0 sshd-session[215594]: Invalid user test from 146.190.226.24 port 33606
Feb 16 13:49:28 compute-0 sshd-session[215594]: Connection closed by invalid user test 146.190.226.24 port 33606 [preauth]
Feb 16 13:49:29 compute-0 podman[195053]: time="2026-02-16T13:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:49:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:49:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.276 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.426 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updating instance_info_cache with network_info: [{"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.462 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-18d7e5d6-d36a-46d7-b461-264c28cb9043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.462 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.462 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.463 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.463 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.463 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:49:30 compute-0 nova_compute[185723]: 2026-02-16 13:49:30.616 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:31 compute-0 openstack_network_exporter[197909]: ERROR   13:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:49:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:49:31 compute-0 openstack_network_exporter[197909]: ERROR   13:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:49:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:49:31 compute-0 nova_compute[185723]: 2026-02-16 13:49:31.846 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating tmpfile /var/lib/nova/instances/tmpez75_a0r to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:49:31 compute-0 nova_compute[185723]: 2026-02-16 13:49:31.847 185727 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:49:32 compute-0 nova_compute[185723]: 2026-02-16 13:49:32.668 185727 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:49:32 compute-0 nova_compute[185723]: 2026-02-16 13:49:32.707 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:49:32 compute-0 nova_compute[185723]: 2026-02-16 13:49:32.707 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:49:32 compute-0 nova_compute[185723]: 2026-02-16 13:49:32.708 185727 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:49:33 compute-0 nova_compute[185723]: 2026-02-16 13:49:33.458 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:33 compute-0 nova_compute[185723]: 2026-02-16 13:49:33.459 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:35 compute-0 nova_compute[185723]: 2026-02-16 13:49:35.279 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:35 compute-0 nova_compute[185723]: 2026-02-16 13:49:35.618 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.500 185727 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.530 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.532 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.532 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating instance directory: /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.533 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating disk.info with the contents: {'/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk': 'qcow2', '/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.533 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.534 185727 DEBUG nova.objects.instance [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.566 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.617 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.619 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.619 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.635 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.683 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.684 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.787 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk 1073741824" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.788 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.789 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.840 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.841 185727 DEBUG nova.virt.disk.api [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.841 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.888 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.889 185727 DEBUG nova.virt.disk.api [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.889 185727 DEBUG nova.objects.instance [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.916 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.935 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config 485376" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.937 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config to /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:49:37 compute-0 nova_compute[185723]: 2026-02-16 13:49:37.937 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.390 185727 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.391 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.393 185727 DEBUG nova.virt.libvirt.vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:48:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:48:29Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.393 185727 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.394 185727 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.395 185727 DEBUG os_vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.396 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.396 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.397 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.399 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.400 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4005b3ce-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.400 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4005b3ce-3d, col_values=(('external_ids', {'iface-id': '4005b3ce-3d4d-4741-91d2-940ee880a617', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:67:1b', 'vm-uuid': '4433d998-a1da-44d3-ae35-b75895398b1f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.402 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:38 compute-0 NetworkManager[56177]: <info>  [1771249778.4035] manager: (tap4005b3ce-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.406 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.409 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.410 185727 INFO os_vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d')
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.411 185727 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:49:38 compute-0 nova_compute[185723]: 2026-02-16 13:49:38.411 185727 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:49:39 compute-0 nova_compute[185723]: 2026-02-16 13:49:39.419 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:39.421 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:39.422 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:49:40 compute-0 nova_compute[185723]: 2026-02-16 13:49:40.157 185727 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:49:40 compute-0 nova_compute[185723]: 2026-02-16 13:49:40.159 185727 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:49:40 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:49:40 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:49:40 compute-0 podman[215619]: 2026-02-16 13:49:40.399938152 +0000 UTC m=+0.047417639 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 13:49:40 compute-0 podman[215618]: 2026-02-16 13:49:40.434290076 +0000 UTC m=+0.082519842 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Feb 16 13:49:40 compute-0 kernel: tap4005b3ce-3d: entered promiscuous mode
Feb 16 13:49:40 compute-0 NetworkManager[56177]: <info>  [1771249780.5168] manager: (tap4005b3ce-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Feb 16 13:49:40 compute-0 systemd-udevd[215689]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:49:40 compute-0 nova_compute[185723]: 2026-02-16 13:49:40.571 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:40 compute-0 ovn_controller[96072]: 2026-02-16T13:49:40Z|00206|binding|INFO|Claiming lport 4005b3ce-3d4d-4741-91d2-940ee880a617 for this additional chassis.
Feb 16 13:49:40 compute-0 ovn_controller[96072]: 2026-02-16T13:49:40Z|00207|binding|INFO|4005b3ce-3d4d-4741-91d2-940ee880a617: Claiming fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:49:40 compute-0 ovn_controller[96072]: 2026-02-16T13:49:40Z|00208|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 ovn-installed in OVS
Feb 16 13:49:40 compute-0 nova_compute[185723]: 2026-02-16 13:49:40.578 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:40 compute-0 NetworkManager[56177]: <info>  [1771249780.5862] device (tap4005b3ce-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:49:40 compute-0 NetworkManager[56177]: <info>  [1771249780.5872] device (tap4005b3ce-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:49:40 compute-0 systemd-machined[155229]: New machine qemu-20-instance-00000017.
Feb 16 13:49:40 compute-0 nova_compute[185723]: 2026-02-16 13:49:40.619 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:40 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000017.
Feb 16 13:49:41 compute-0 nova_compute[185723]: 2026-02-16 13:49:41.517 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249781.5172079, 4433d998-a1da-44d3-ae35-b75895398b1f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:49:41 compute-0 nova_compute[185723]: 2026-02-16 13:49:41.518 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Started (Lifecycle Event)
Feb 16 13:49:41 compute-0 nova_compute[185723]: 2026-02-16 13:49:41.541 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:49:42 compute-0 nova_compute[185723]: 2026-02-16 13:49:42.190 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249782.1905353, 4433d998-a1da-44d3-ae35-b75895398b1f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:49:42 compute-0 nova_compute[185723]: 2026-02-16 13:49:42.192 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Resumed (Lifecycle Event)
Feb 16 13:49:42 compute-0 nova_compute[185723]: 2026-02-16 13:49:42.220 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:49:42 compute-0 nova_compute[185723]: 2026-02-16 13:49:42.224 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:49:42 compute-0 nova_compute[185723]: 2026-02-16 13:49:42.255 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:49:43 compute-0 nova_compute[185723]: 2026-02-16 13:49:43.437 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:43 compute-0 ovn_controller[96072]: 2026-02-16T13:49:43Z|00209|binding|INFO|Claiming lport 4005b3ce-3d4d-4741-91d2-940ee880a617 for this chassis.
Feb 16 13:49:43 compute-0 ovn_controller[96072]: 2026-02-16T13:49:43Z|00210|binding|INFO|4005b3ce-3d4d-4741-91d2-940ee880a617: Claiming fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:49:43 compute-0 ovn_controller[96072]: 2026-02-16T13:49:43Z|00211|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 up in Southbound
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.804 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.806 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.807 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.820 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a3509459-c21e-4991-a714-d9f3376f5a15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.845 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[f724366b-d667-4b46-aab6-5ce456c662c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.848 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[740e7484-75ae-471f-b8a6-dc8253a0bbe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.870 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[735fc5ea-1cc8-444f-a23e-44adcd402d6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.883 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fa057619-657a-4541-bf9e-61c65911dc82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580491, 'reachable_time': 24379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215726, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.897 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d84c0a0d-9009-4862-bd71-955ac677025b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580500, 'tstamp': 580500}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215727, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580502, 'tstamp': 580502}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215727, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.899 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:43 compute-0 nova_compute[185723]: 2026-02-16 13:49:43.901 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:43 compute-0 nova_compute[185723]: 2026-02-16 13:49:43.901 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.902 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.902 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.903 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:43.903 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:44 compute-0 nova_compute[185723]: 2026-02-16 13:49:44.135 185727 INFO nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Post operation of migration started
Feb 16 13:49:44 compute-0 nova_compute[185723]: 2026-02-16 13:49:44.474 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:49:44 compute-0 nova_compute[185723]: 2026-02-16 13:49:44.475 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:49:44 compute-0 nova_compute[185723]: 2026-02-16 13:49:44.475 185727 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.621 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.741 185727 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.761 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.775 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.776 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.776 185727 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:45 compute-0 nova_compute[185723]: 2026-02-16 13:49:45.781 185727 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:49:45 compute-0 virtqemud[184843]: Domain id=20 name='instance-00000017' uuid=4433d998-a1da-44d3-ae35-b75895398b1f is tainted: custom-monitor
Feb 16 13:49:46 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:46.424 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:46 compute-0 sshd-session[215728]: Invalid user ubuntu from 64.227.72.94 port 47742
Feb 16 13:49:46 compute-0 sshd-session[215728]: Connection closed by invalid user ubuntu 64.227.72.94 port 47742 [preauth]
Feb 16 13:49:46 compute-0 podman[215730]: 2026-02-16 13:49:46.579085437 +0000 UTC m=+0.077398474 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 13:49:46 compute-0 nova_compute[185723]: 2026-02-16 13:49:46.789 185727 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:49:47 compute-0 nova_compute[185723]: 2026-02-16 13:49:47.796 185727 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:49:47 compute-0 nova_compute[185723]: 2026-02-16 13:49:47.806 185727 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:49:47 compute-0 nova_compute[185723]: 2026-02-16 13:49:47.838 185727 DEBUG nova.objects.instance [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:49:48 compute-0 nova_compute[185723]: 2026-02-16 13:49:48.440 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:50 compute-0 nova_compute[185723]: 2026-02-16 13:49:50.624 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.160 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.160 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.161 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.161 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.161 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.162 185727 INFO nova.compute.manager [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Terminating instance
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.163 185727 DEBUG nova.compute.manager [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:49:52 compute-0 kernel: tap321f3fac-00 (unregistering): left promiscuous mode
Feb 16 13:49:52 compute-0 NetworkManager[56177]: <info>  [1771249792.2257] device (tap321f3fac-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:49:52 compute-0 ovn_controller[96072]: 2026-02-16T13:49:52Z|00212|binding|INFO|Releasing lport 321f3fac-0060-4083-a357-cce4f142588b from this chassis (sb_readonly=0)
Feb 16 13:49:52 compute-0 ovn_controller[96072]: 2026-02-16T13:49:52Z|00213|binding|INFO|Setting lport 321f3fac-0060-4083-a357-cce4f142588b down in Southbound
Feb 16 13:49:52 compute-0 ovn_controller[96072]: 2026-02-16T13:49:52Z|00214|binding|INFO|Removing iface tap321f3fac-00 ovn-installed in OVS
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.233 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.238 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.244 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:12:9a 10.100.0.5'], port_security=['fa:16:3e:5e:12:9a 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '18d7e5d6-d36a-46d7-b461-264c28cb9043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=321f3fac-0060-4083-a357-cce4f142588b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.245 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 321f3fac-0060-4083-a357-cce4f142588b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.246 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.258 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea6e4e4-0a72-470d-bec0-b3cca3c1e65e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Deactivated successfully.
Feb 16 13:49:52 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Consumed 14.109s CPU time.
Feb 16 13:49:52 compute-0 systemd-machined[155229]: Machine qemu-19-instance-00000018 terminated.
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.284 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0945d51b-51ec-4672-b29c-e0fc062614a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.287 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea993d3-4f21-4c87-8123-30287c63a30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.302 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0874a48a-df83-47c7-95e3-d836e5391fbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.317 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[dc019d83-dd26-464e-8650-0b80384e0da1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580491, 'reachable_time': 24379, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215768, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.327 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f6aeb10c-89a9-4878-9013-e5090d23e2e2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580500, 'tstamp': 580500}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215769, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580502, 'tstamp': 580502}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215769, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.329 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.331 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.336 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.336 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.337 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.337 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:52 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:52.338 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.380 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.384 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.425 185727 INFO nova.virt.libvirt.driver [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Instance destroyed successfully.
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.425 185727 DEBUG nova.objects.instance [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 18d7e5d6-d36a-46d7-b461-264c28cb9043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.445 185727 DEBUG nova.virt.libvirt.vif [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:48:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1866584778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1866584778',id=24,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:48:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-lgc90ub5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:48:42Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=18d7e5d6-d36a-46d7-b461-264c28cb9043,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.446 185727 DEBUG nova.network.os_vif_util [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "321f3fac-0060-4083-a357-cce4f142588b", "address": "fa:16:3e:5e:12:9a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap321f3fac-00", "ovs_interfaceid": "321f3fac-0060-4083-a357-cce4f142588b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.446 185727 DEBUG nova.network.os_vif_util [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.446 185727 DEBUG os_vif [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.448 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.449 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap321f3fac-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.450 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.452 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.454 185727 INFO os_vif [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:12:9a,bridge_name='br-int',has_traffic_filtering=True,id=321f3fac-0060-4083-a357-cce4f142588b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap321f3fac-00')
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.454 185727 INFO nova.virt.libvirt.driver [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Deleting instance files /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043_del
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.455 185727 INFO nova.virt.libvirt.driver [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Deletion of /var/lib/nova/instances/18d7e5d6-d36a-46d7-b461-264c28cb9043_del complete
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.524 185727 INFO nova.compute.manager [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.524 185727 DEBUG oslo.service.loopingcall [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.525 185727 DEBUG nova.compute.manager [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:49:52 compute-0 nova_compute[185723]: 2026-02-16 13:49:52.525 185727 DEBUG nova.network.neutron [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.295 185727 DEBUG nova.compute.manager [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-unplugged-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.295 185727 DEBUG oslo_concurrency.lockutils [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.295 185727 DEBUG oslo_concurrency.lockutils [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.295 185727 DEBUG oslo_concurrency.lockutils [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.296 185727 DEBUG nova.compute.manager [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] No waiting events found dispatching network-vif-unplugged-321f3fac-0060-4083-a357-cce4f142588b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.296 185727 DEBUG nova.compute.manager [req-d739fe44-3b94-4252-ad47-34cc204014bf req-c2065869-32ee-4663-9475-e5c743766253 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-unplugged-321f3fac-0060-4083-a357-cce4f142588b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:49:53 compute-0 sshd-session[215787]: Invalid user postgres from 188.166.42.159 port 43550
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.577 185727 DEBUG nova.network.neutron [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.592 185727 INFO nova.compute.manager [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Took 1.07 seconds to deallocate network for instance.
Feb 16 13:49:53 compute-0 sshd-session[215787]: Connection closed by invalid user postgres 188.166.42.159 port 43550 [preauth]
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.668 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.669 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.686 185727 DEBUG nova.compute.manager [req-00f25309-45a1-4019-8823-a89248da6973 req-f34aa5bf-694e-43bd-80dd-f524a94381fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-deleted-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.752 185727 DEBUG nova.compute.provider_tree [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.769 185727 DEBUG nova.scheduler.client.report [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.793 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.830 185727 INFO nova.scheduler.client.report [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 18d7e5d6-d36a-46d7-b461-264c28cb9043
Feb 16 13:49:53 compute-0 nova_compute[185723]: 2026-02-16 13:49:53.929 185727 DEBUG oslo_concurrency.lockutils [None req-e689e1e5-0457-4425-b8fe-522283b9372c e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.714 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.714 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.714 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.715 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.715 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.716 185727 INFO nova.compute.manager [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Terminating instance
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.717 185727 DEBUG nova.compute.manager [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:49:54 compute-0 kernel: tap4005b3ce-3d (unregistering): left promiscuous mode
Feb 16 13:49:54 compute-0 NetworkManager[56177]: <info>  [1771249794.7405] device (tap4005b3ce-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00215|binding|INFO|Releasing lport 4005b3ce-3d4d-4741-91d2-940ee880a617 from this chassis (sb_readonly=0)
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00216|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 down in Southbound
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.748 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00217|binding|INFO|Removing iface tap4005b3ce-3d ovn-installed in OVS
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.756 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.758 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.758 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.759 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.760 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0e2f35-f7c6-4639-9970-4e9bbe9fdd1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.761 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:49:54 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000017.scope: Deactivated successfully.
Feb 16 13:49:54 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000017.scope: Consumed 1.686s CPU time.
Feb 16 13:49:54 compute-0 systemd-machined[155229]: Machine qemu-20-instance-00000017 terminated.
Feb 16 13:49:54 compute-0 podman[215792]: 2026-02-16 13:49:54.828455088 +0000 UTC m=+0.056839734 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [NOTICE]   (215406) : haproxy version is 2.8.14-c23fe91
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [NOTICE]   (215406) : path to executable is /usr/sbin/haproxy
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [WARNING]  (215406) : Exiting Master process...
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [WARNING]  (215406) : Exiting Master process...
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [ALERT]    (215406) : Current worker (215409) exited with code 143 (Terminated)
Feb 16 13:49:54 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215397]: [WARNING]  (215406) : All workers exited. Exiting... (0)
Feb 16 13:49:54 compute-0 systemd[1]: libpod-ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e.scope: Deactivated successfully.
Feb 16 13:49:54 compute-0 podman[215835]: 2026-02-16 13:49:54.885089824 +0000 UTC m=+0.043462620 container died ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 16 13:49:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e-userdata-shm.mount: Deactivated successfully.
Feb 16 13:49:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-162a069f8c85a53382bea5d11160c4d165fee1f65a30aa36713f4a016e78157d-merged.mount: Deactivated successfully.
Feb 16 13:49:54 compute-0 podman[215835]: 2026-02-16 13:49:54.920562945 +0000 UTC m=+0.078935741 container cleanup ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:49:54 compute-0 NetworkManager[56177]: <info>  [1771249794.9335] manager: (tap4005b3ce-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Feb 16 13:49:54 compute-0 systemd[1]: libpod-conmon-ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e.scope: Deactivated successfully.
Feb 16 13:49:54 compute-0 kernel: tap4005b3ce-3d: entered promiscuous mode
Feb 16 13:49:54 compute-0 kernel: tap4005b3ce-3d (unregistering): left promiscuous mode
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.938 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00218|binding|INFO|Claiming lport 4005b3ce-3d4d-4741-91d2-940ee880a617 for this chassis.
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00219|binding|INFO|4005b3ce-3d4d-4741-91d2-940ee880a617: Claiming fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.945 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00220|if_status|INFO|Dropped 5 log messages in last 1539 seconds (most recently, 1539 seconds ago) due to excessive rate
Feb 16 13:49:54 compute-0 ovn_controller[96072]: 2026-02-16T13:49:54Z|00221|if_status|INFO|Not setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 down as sb is readonly
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.946 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.974 185727 INFO nova.virt.libvirt.driver [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance destroyed successfully.
Feb 16 13:49:54 compute-0 nova_compute[185723]: 2026-02-16 13:49:54.975 185727 DEBUG nova.objects.instance [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:54 compute-0 podman[215867]: 2026-02-16 13:49:54.992065612 +0000 UTC m=+0.048365893 container remove ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.996 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[95b29388-2119-4bd7-a63a-3a31bbbd8207]: (4, ('Mon Feb 16 01:49:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e)\nec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e\nMon Feb 16 01:49:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e)\nec803df620918179e624404a16a88eccf163884843909bb613b7fbbfffe21a4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.998 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0d162c-20b5-485e-a58a-07cfed822355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:54 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:54.999 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.001 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.010 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.014 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfdbd8c-bb9c-42fd-845d-85f481236a76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.029 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[44d1a814-3d3e-4da6-b597-c10c2f62ac58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.031 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[35e65121-ae10-4e44-8809-246f6b288cd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.045 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0c1dfb-70b2-466e-a581-be23f8eb737c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580486, 'reachable_time': 23948, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215891, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_controller[96072]: 2026-02-16T13:49:55Z|00222|binding|INFO|Releasing lport 4005b3ce-3d4d-4741-91d2-940ee880a617 from this chassis (sb_readonly=0)
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.048 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.048 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[8abe909f-0a76-4fe6-9a73-2ff612604439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.051 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.052 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.053 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.054 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.066 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8a8e1127-bc24-4dda-93c6-264d64c40785]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.067 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.068 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.068 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[15e85e63-3a15-4bcf-8d57-08cc518d7c0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.069 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7009953a-e6fa-445e-aff5-adb37cac147a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.078 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[5101732b-92b2-413b-8769-60c596af4fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.088 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[821cb4ec-46b9-4379-9a93-88a932de4d6d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.107 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[9116c43b-54cb-42a6-a081-ef9602019f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.110 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c06f4a7b-5f47-4eb8-8118-e54c231962f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 systemd-udevd[215761]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:49:55 compute-0 NetworkManager[56177]: <info>  [1771249795.1126] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.129 185727 DEBUG nova.virt.libvirt.vif [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:48:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:49:47Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.129 185727 DEBUG nova.network.os_vif_util [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.130 185727 DEBUG nova.network.os_vif_util [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.131 185727 DEBUG os_vif [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.129 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.133 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.133 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4005b3ce-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.169 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[fef001da-d0f2-43af-962b-bddc7ef1de22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.170 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.173 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.172 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[89a8924a-64f3-423f-9b8c-c4336d156f4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.174 185727 INFO os_vif [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d')
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.175 185727 INFO nova.virt.libvirt.driver [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Deleting instance files /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f_del
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.176 185727 INFO nova.virt.libvirt.driver [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Deletion of /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f_del complete
Feb 16 13:49:55 compute-0 NetworkManager[56177]: <info>  [1771249795.1888] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.190 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[49da0137-f29d-4596-b165-ff8a8a96c83b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.205 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3df2e706-8656-41d8-9f22-6f92fb8bbc8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587858, 'reachable_time': 30004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215916, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.218 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdee72f-48b5-443c-a72f-277a4b77d816]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587858, 'tstamp': 587858}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215917, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.236 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2e69ccab-38f8-441a-8b0e-3bd89c539728]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587858, 'reachable_time': 30004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215918, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.251 185727 INFO nova.compute.manager [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Took 0.53 seconds to destroy the instance on the hypervisor.
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.251 185727 DEBUG oslo.service.loopingcall [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.251 185727 DEBUG nova.compute.manager [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.252 185727 DEBUG nova.network.neutron [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.261 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[84c1ad3c-1630-4387-89cb-e46381c8c990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.305 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd4dfb5-c2f1-4e09-ab63-1b3157fcd755]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.307 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.307 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.308 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.310 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 NetworkManager[56177]: <info>  [1771249795.3107] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Feb 16 13:49:55 compute-0 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.314 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.315 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 ovn_controller[96072]: 2026-02-16T13:49:55Z|00223|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.317 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.318 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b34485b9-c2e2-405f-b8b6-f543a1616f91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.319 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.320 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.321 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.474 185727 DEBUG nova.compute.manager [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.475 185727 DEBUG oslo_concurrency.lockutils [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.475 185727 DEBUG oslo_concurrency.lockutils [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.475 185727 DEBUG oslo_concurrency.lockutils [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "18d7e5d6-d36a-46d7-b461-264c28cb9043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.475 185727 DEBUG nova.compute.manager [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] No waiting events found dispatching network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.475 185727 WARNING nova.compute.manager [req-f3547de0-ca58-4374-b32f-bcb877a13724 req-a52e1722-9cc8-432b-82bd-d50b934cdb76 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Received unexpected event network-vif-plugged-321f3fac-0060-4083-a357-cce4f142588b for instance with vm_state deleted and task_state None.
Feb 16 13:49:55 compute-0 podman[215950]: 2026-02-16 13:49:55.620703203 +0000 UTC m=+0.042901787 container create ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.627 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 systemd[1]: Started libpod-conmon-ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755.scope.
Feb 16 13:49:55 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:49:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faff50e9fca03e8296d662eb2faaf7f2e7585f640adffbf51d422fc29ee2b115/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:49:55 compute-0 podman[215950]: 2026-02-16 13:49:55.597194859 +0000 UTC m=+0.019393463 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:49:55 compute-0 podman[215950]: 2026-02-16 13:49:55.695782149 +0000 UTC m=+0.117980743 container init ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:49:55 compute-0 podman[215950]: 2026-02-16 13:49:55.70023047 +0000 UTC m=+0.122429044 container start ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [NOTICE]   (215969) : New worker (215971) forked
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [NOTICE]   (215969) : Loading success.
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.746 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.747 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.748 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2138eb-107e-4a7f-b748-27abbb91b453]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.749 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.773 185727 DEBUG nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.773 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.774 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.775 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.775 185727 DEBUG oslo_concurrency.lockutils [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.775 185727 DEBUG nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.775 185727 WARNING nova.compute.manager [req-e0ee45ee-bc79-49ea-8044-f44353c213f6 req-25970272-f1b8-4c39-9cb0-93334fd7ad2a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state deleting.
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [NOTICE]   (215969) : haproxy version is 2.8.14-c23fe91
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [NOTICE]   (215969) : path to executable is /usr/sbin/haproxy
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [WARNING]  (215969) : Exiting Master process...
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [ALERT]    (215969) : Current worker (215971) exited with code 143 (Terminated)
Feb 16 13:49:55 compute-0 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215965]: [WARNING]  (215969) : All workers exited. Exiting... (0)
Feb 16 13:49:55 compute-0 systemd[1]: libpod-ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755.scope: Deactivated successfully.
Feb 16 13:49:55 compute-0 podman[215997]: 2026-02-16 13:49:55.858647486 +0000 UTC m=+0.039793230 container died ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:49:55 compute-0 podman[215997]: 2026-02-16 13:49:55.890620561 +0000 UTC m=+0.071766285 container cleanup ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 16 13:49:55 compute-0 systemd[1]: libpod-conmon-ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755.scope: Deactivated successfully.
Feb 16 13:49:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755-userdata-shm.mount: Deactivated successfully.
Feb 16 13:49:55 compute-0 podman[216028]: 2026-02-16 13:49:55.940535301 +0000 UTC m=+0.033778180 container remove ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.944 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[50df1913-5d6a-454c-b5c6-0076ecd74a83]: (4, ('Mon Feb 16 01:49:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755)\nec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755\nMon Feb 16 01:49:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755)\nec11f03f8cf31827b2056d63aa965ca9eb12184aa61c277dc7d3ddd73f849755\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.946 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[647895a2-c25c-4568-bce5-df75a5c90844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.947 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.948 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:49:55 compute-0 nova_compute[185723]: 2026-02-16 13:49:55.952 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.954 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f42f4e6f-92e1-4047-a553-83e65cde3535]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.969 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[dad0e8d5-3c64-4ddc-bc7e-924234dfa36a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.972 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d337eaf4-809d-45be-8ed0-3d0f74a11d5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.984 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b4ff18-1ea9-473f-a188-3490e85fcb59]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587849, 'reachable_time': 43258, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216041, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.987 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:49:55 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:49:55.987 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e21504dc-6a23-4e65-9a5f-cfb13ebf3040]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.060 185727 DEBUG nova.network.neutron [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.077 185727 INFO nova.compute.manager [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Took 0.82 seconds to deallocate network for instance.
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.119 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.119 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.124 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.167 185727 INFO nova.scheduler.client.report [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 4433d998-a1da-44d3-ae35-b75895398b1f
Feb 16 13:49:56 compute-0 nova_compute[185723]: 2026-02-16 13:49:56.237 185727 DEBUG oslo_concurrency.lockutils [None req-b5676753-ec9f-44cc-95e0-60554ad991a7 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:57 compute-0 nova_compute[185723]: 2026-02-16 13:49:57.903 185727 DEBUG nova.compute.manager [req-3d5dded9-9e12-42cc-9df0-76d907cd6030 req-432391f8-216d-43d2-a77a-0b430433deec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-deleted-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:59 compute-0 podman[195053]: time="2026-02-16T13:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:49:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:49:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:50:00 compute-0 nova_compute[185723]: 2026-02-16 13:50:00.171 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:00 compute-0 nova_compute[185723]: 2026-02-16 13:50:00.629 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:01 compute-0 openstack_network_exporter[197909]: ERROR   13:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:50:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:50:01 compute-0 openstack_network_exporter[197909]: ERROR   13:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:50:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:50:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:03.244 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:03.244 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:03.245 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:05 compute-0 nova_compute[185723]: 2026-02-16 13:50:05.173 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:05 compute-0 nova_compute[185723]: 2026-02-16 13:50:05.630 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:07 compute-0 nova_compute[185723]: 2026-02-16 13:50:07.424 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249792.4226122, 18d7e5d6-d36a-46d7-b461-264c28cb9043 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:50:07 compute-0 nova_compute[185723]: 2026-02-16 13:50:07.424 185727 INFO nova.compute.manager [-] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] VM Stopped (Lifecycle Event)
Feb 16 13:50:07 compute-0 nova_compute[185723]: 2026-02-16 13:50:07.452 185727 DEBUG nova.compute.manager [None req-a9d6d0e5-b254-4507-9fb1-3723fa8d434b - - - - - -] [instance: 18d7e5d6-d36a-46d7-b461-264c28cb9043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:09 compute-0 nova_compute[185723]: 2026-02-16 13:50:09.973 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249794.9715981, 4433d998-a1da-44d3-ae35-b75895398b1f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:50:09 compute-0 nova_compute[185723]: 2026-02-16 13:50:09.973 185727 INFO nova.compute.manager [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Stopped (Lifecycle Event)
Feb 16 13:50:09 compute-0 nova_compute[185723]: 2026-02-16 13:50:09.996 185727 DEBUG nova.compute.manager [None req-7b238f5c-c9d4-4bf1-b2a6-ab94d0b6a9e4 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:10 compute-0 nova_compute[185723]: 2026-02-16 13:50:10.176 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:10 compute-0 nova_compute[185723]: 2026-02-16 13:50:10.631 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:11 compute-0 podman[216047]: 2026-02-16 13:50:11.062461209 +0000 UTC m=+0.101677228 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:50:11 compute-0 podman[216046]: 2026-02-16 13:50:11.062999842 +0000 UTC m=+0.104651891 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9)
Feb 16 13:50:15 compute-0 nova_compute[185723]: 2026-02-16 13:50:15.217 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:15 compute-0 nova_compute[185723]: 2026-02-16 13:50:15.632 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:17 compute-0 podman[216086]: 2026-02-16 13:50:17.018085501 +0000 UTC m=+0.059303275 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:50:17 compute-0 nova_compute[185723]: 2026-02-16 13:50:17.249 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:20 compute-0 nova_compute[185723]: 2026-02-16 13:50:20.219 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:20 compute-0 nova_compute[185723]: 2026-02-16 13:50:20.633 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.460 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.460 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.460 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.461 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.584 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.585 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5820MB free_disk=73.2200698852539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.585 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.585 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.697 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.698 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.727 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.743 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.778 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:50:24 compute-0 nova_compute[185723]: 2026-02-16 13:50:24.779 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:24 compute-0 podman[216113]: 2026-02-16 13:50:24.999054501 +0000 UTC m=+0.044054116 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:50:25 compute-0 nova_compute[185723]: 2026-02-16 13:50:25.221 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:25 compute-0 nova_compute[185723]: 2026-02-16 13:50:25.637 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:26 compute-0 nova_compute[185723]: 2026-02-16 13:50:26.780 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:26 compute-0 nova_compute[185723]: 2026-02-16 13:50:26.781 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.452 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.453 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:28 compute-0 nova_compute[185723]: 2026-02-16 13:50:28.453 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:29 compute-0 podman[195053]: time="2026-02-16T13:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:50:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:50:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 13:50:30 compute-0 nova_compute[185723]: 2026-02-16 13:50:30.224 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:30 compute-0 nova_compute[185723]: 2026-02-16 13:50:30.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:30 compute-0 nova_compute[185723]: 2026-02-16 13:50:30.639 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:30 compute-0 sshd-session[216140]: Invalid user ubuntu from 64.227.72.94 port 55288
Feb 16 13:50:30 compute-0 sshd-session[216140]: Connection closed by invalid user ubuntu 64.227.72.94 port 55288 [preauth]
Feb 16 13:50:31 compute-0 openstack_network_exporter[197909]: ERROR   13:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:50:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:50:31 compute-0 openstack_network_exporter[197909]: ERROR   13:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:50:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:50:31 compute-0 nova_compute[185723]: 2026-02-16 13:50:31.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:31 compute-0 nova_compute[185723]: 2026-02-16 13:50:31.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:50:32 compute-0 nova_compute[185723]: 2026-02-16 13:50:32.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:35 compute-0 nova_compute[185723]: 2026-02-16 13:50:35.227 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:35 compute-0 nova_compute[185723]: 2026-02-16 13:50:35.641 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:37 compute-0 sshd-session[216142]: Invalid user test from 146.190.226.24 port 45674
Feb 16 13:50:37 compute-0 sshd-session[216142]: Connection closed by invalid user test 146.190.226.24 port 45674 [preauth]
Feb 16 13:50:40 compute-0 nova_compute[185723]: 2026-02-16 13:50:40.229 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:40 compute-0 nova_compute[185723]: 2026-02-16 13:50:40.643 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:42 compute-0 podman[216145]: 2026-02-16 13:50:42.017484912 +0000 UTC m=+0.053673515 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 16 13:50:42 compute-0 podman[216146]: 2026-02-16 13:50:42.018428695 +0000 UTC m=+0.051381608 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:50:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:43.581 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:50:43 compute-0 nova_compute[185723]: 2026-02-16 13:50:43.582 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:43.582 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:50:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:43.583 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:45 compute-0 nova_compute[185723]: 2026-02-16 13:50:45.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:45 compute-0 nova_compute[185723]: 2026-02-16 13:50:45.644 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:48 compute-0 podman[216185]: 2026-02-16 13:50:48.029288381 +0000 UTC m=+0.069425086 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:50:50 compute-0 nova_compute[185723]: 2026-02-16 13:50:50.234 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:50 compute-0 nova_compute[185723]: 2026-02-16 13:50:50.647 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:51 compute-0 sshd-session[216211]: Invalid user postgres from 188.166.42.159 port 38838
Feb 16 13:50:51 compute-0 ovn_controller[96072]: 2026-02-16T13:50:51Z|00224|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:50:51 compute-0 sshd-session[216211]: Connection closed by invalid user postgres 188.166.42.159 port 38838 [preauth]
Feb 16 13:50:52 compute-0 nova_compute[185723]: 2026-02-16 13:50:52.951 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:52 compute-0 nova_compute[185723]: 2026-02-16 13:50:52.952 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:52 compute-0 nova_compute[185723]: 2026-02-16 13:50:52.965 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.047 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.048 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.055 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.056 185727 INFO nova.compute.claims [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.169 185727 DEBUG nova.compute.provider_tree [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.185 185727 DEBUG nova.scheduler.client.report [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.221 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.221 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.268 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.268 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.295 185727 INFO nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.316 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.451 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.452 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.452 185727 INFO nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating image(s)
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.453 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.453 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.453 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.464 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.509 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.510 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.510 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.521 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.570 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.571 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.600 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.601 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.601 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.651 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.652 185727 DEBUG nova.virt.disk.api [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Checking if we can resize image /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.652 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.700 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.701 185727 DEBUG nova.virt.disk.api [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Cannot resize image /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.702 185727 DEBUG nova.objects.instance [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'migration_context' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.725 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.726 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Ensure instance console log exists: /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.726 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.727 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:53 compute-0 nova_compute[185723]: 2026-02-16 13:50:53.727 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:54 compute-0 nova_compute[185723]: 2026-02-16 13:50:54.090 185727 DEBUG nova.policy [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7c8dce27a2f4917a7dac485b1d8754a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:50:54 compute-0 sshd-session[216213]: Invalid user mysql from 146.190.22.227 port 59320
Feb 16 13:50:54 compute-0 sshd-session[216213]: Connection closed by invalid user mysql 146.190.22.227 port 59320 [preauth]
Feb 16 13:50:55 compute-0 nova_compute[185723]: 2026-02-16 13:50:55.237 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:55 compute-0 nova_compute[185723]: 2026-02-16 13:50:55.265 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Successfully created port: b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:50:55 compute-0 nova_compute[185723]: 2026-02-16 13:50:55.647 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:56 compute-0 podman[216230]: 2026-02-16 13:50:56.031047347 +0000 UTC m=+0.070386600 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.049 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Successfully updated port: b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.070 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.071 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.071 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.148 185727 DEBUG nova.compute.manager [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-changed-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.148 185727 DEBUG nova.compute.manager [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Refreshing instance network info cache due to event network-changed-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.148 185727 DEBUG oslo_concurrency.lockutils [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:50:56 compute-0 nova_compute[185723]: 2026-02-16 13:50:56.200 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.393 185727 DEBUG nova.network.neutron [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.413 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.414 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance network_info: |[{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.414 185727 DEBUG oslo_concurrency.lockutils [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.414 185727 DEBUG nova.network.neutron [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Refreshing network info cache for port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.416 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Start _get_guest_xml network_info=[{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.420 185727 WARNING nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.425 185727 DEBUG nova.virt.libvirt.host [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.426 185727 DEBUG nova.virt.libvirt.host [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.429 185727 DEBUG nova.virt.libvirt.host [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.429 185727 DEBUG nova.virt.libvirt.host [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.430 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.430 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.431 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.431 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.431 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.431 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.431 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.432 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.432 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.432 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.432 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.432 185727 DEBUG nova.virt.hardware [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.436 185727 DEBUG nova.virt.libvirt.vif [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:50:53Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.436 185727 DEBUG nova.network.os_vif_util [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.436 185727 DEBUG nova.network.os_vif_util [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.437 185727 DEBUG nova.objects.instance [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.460 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <uuid>6b24deb5-a1f1-4154-a8a4-c31c69dc5d32</uuid>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <name>instance-00000019</name>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402</nova:name>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:50:57</nova:creationTime>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:user uuid="c7c8dce27a2f4917a7dac485b1d8754a">tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member</nova:user>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:project uuid="5c4a5b3f08ab466eaac86305d91fd9a8">tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259</nova:project>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         <nova:port uuid="b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1">
Feb 16 13:50:57 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <system>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="serial">6b24deb5-a1f1-4154-a8a4-c31c69dc5d32</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="uuid">6b24deb5-a1f1-4154-a8a4-c31c69dc5d32</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </system>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <os>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </os>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <features>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </features>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:cd:81:d6"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <target dev="tapb7a22eb4-a3"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/console.log" append="off"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <video>
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </video>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:50:57 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:50:57 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:50:57 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:50:57 compute-0 nova_compute[185723]: </domain>
Feb 16 13:50:57 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.461 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Preparing to wait for external event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.461 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.461 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.462 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.462 185727 DEBUG nova.virt.libvirt.vif [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:50:53Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.463 185727 DEBUG nova.network.os_vif_util [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.463 185727 DEBUG nova.network.os_vif_util [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.463 185727 DEBUG os_vif [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.464 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.464 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.465 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.467 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.467 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7a22eb4-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.467 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7a22eb4-a3, col_values=(('external_ids', {'iface-id': 'b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:81:d6', 'vm-uuid': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.469 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:57 compute-0 NetworkManager[56177]: <info>  [1771249857.4699] manager: (tapb7a22eb4-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.471 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.474 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.475 185727 INFO os_vif [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3')
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.547 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.548 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.548 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No VIF found with MAC fa:16:3e:cd:81:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:50:57 compute-0 nova_compute[185723]: 2026-02-16 13:50:57.549 185727 INFO nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Using config drive
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.417 185727 INFO nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating config drive at /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.421 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpguovntlq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.542 185727 DEBUG oslo_concurrency.processutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpguovntlq" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:50:58 compute-0 kernel: tapb7a22eb4-a3: entered promiscuous mode
Feb 16 13:50:58 compute-0 ovn_controller[96072]: 2026-02-16T13:50:58Z|00225|binding|INFO|Claiming lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for this chassis.
Feb 16 13:50:58 compute-0 ovn_controller[96072]: 2026-02-16T13:50:58Z|00226|binding|INFO|b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1: Claiming fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.591 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.5927] manager: (tapb7a22eb4-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.595 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.605 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.606 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f bound to our chassis
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.607 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.613 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.615 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ba598c-4399-4ba9-ab97-4549d254dbb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.616 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap25f604b5-71 in ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:50:58 compute-0 systemd-udevd[216274]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.618 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap25f604b5-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.618 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5a26d3ee-5348-431c-96eb-9605c3e49bec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.618 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d28dac6e-eafe-4612-9594-0edc3d97e20a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_controller[96072]: 2026-02-16T13:50:58Z|00227|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 ovn-installed in OVS
Feb 16 13:50:58 compute-0 ovn_controller[96072]: 2026-02-16T13:50:58Z|00228|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 up in Southbound
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.620 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 systemd-machined[155229]: New machine qemu-21-instance-00000019.
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.628 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[7b808b98-2522-4f1d-8751-b3981381bd62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.6310] device (tapb7a22eb4-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.6319] device (tapb7a22eb4-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:50:58 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000019.
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.639 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[feffb477-cf85-4c61-9571-94b6536c6f0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.663 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[66afe6cb-fa9e-45d1-b4c3-e4c78dc2f245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.669 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7c59f9aa-bb20-47da-8dc1-89a25ce70a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.6699] manager: (tap25f604b5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Feb 16 13:50:58 compute-0 systemd-udevd[216279]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.694 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[b70bbf17-a851-4a9a-9de2-14a9ce7e1e00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.699 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[4f026067-9bcb-4d02-ba72-e1cc8c4bc521]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.7176] device (tap25f604b5-70): carrier: link connected
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.719 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0da4bb-35ba-410d-9f0e-13f25b433745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.733 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[295071b0-bae9-41d0-8270-a4649b16cc1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594211, 'reachable_time': 38757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216308, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.746 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9525a2ff-ba53-4b14-975c-6def1fad79e9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:17ad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594211, 'tstamp': 594211}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216309, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.759 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f87df17e-7115-44e3-ae6a-c853a8c1870c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594211, 'reachable_time': 38757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216310, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.784 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8034cea6-c827-4236-8621-1c73f71ac37c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.833 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[e28d25ff-b860-472f-9e52-21f7f8e03c41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.834 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.835 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.835 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25f604b5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:58 compute-0 NetworkManager[56177]: <info>  [1771249858.8661] manager: (tap25f604b5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.865 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 kernel: tap25f604b5-70: entered promiscuous mode
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.868 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.869 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25f604b5-70, col_values=(('external_ids', {'iface-id': 'a43b300c-9dd2-4ad8-8dd7-aaeb277a3352'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.870 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 ovn_controller[96072]: 2026-02-16T13:50:58Z|00229|binding|INFO|Releasing lport a43b300c-9dd2-4ad8-8dd7-aaeb277a3352 from this chassis (sb_readonly=0)
Feb 16 13:50:58 compute-0 nova_compute[185723]: 2026-02-16 13:50:58.876 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.877 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.878 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[7517b129-fb5e-498b-b6c2-dce8ce87c5bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.879 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:50:58 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:50:58.880 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'env', 'PROCESS_TAG=haproxy-25f604b5-711f-4df5-a65b-4ca0c988350f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/25f604b5-711f-4df5-a65b-4ca0c988350f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:50:59 compute-0 podman[216342]: 2026-02-16 13:50:59.202056664 +0000 UTC m=+0.042101727 container create bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Feb 16 13:50:59 compute-0 systemd[1]: Started libpod-conmon-bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70.scope.
Feb 16 13:50:59 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:50:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f33c7c6e4b22030a06599fb5d532d0072daa22bd37983c4893cbaef0881720/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:50:59 compute-0 podman[216342]: 2026-02-16 13:50:59.179064173 +0000 UTC m=+0.019109266 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:50:59 compute-0 podman[216342]: 2026-02-16 13:50:59.281293473 +0000 UTC m=+0.121338566 container init bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:50:59 compute-0 podman[216342]: 2026-02-16 13:50:59.285500008 +0000 UTC m=+0.125545071 container start bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:50:59 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [NOTICE]   (216361) : New worker (216364) forked
Feb 16 13:50:59 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [NOTICE]   (216361) : Loading success.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.328 185727 DEBUG nova.compute.manager [req-b6137473-c23e-4984-a2c4-abf62924eec5 req-712f1c82-a431-483a-b641-0ecd34ac631e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.329 185727 DEBUG oslo_concurrency.lockutils [req-b6137473-c23e-4984-a2c4-abf62924eec5 req-712f1c82-a431-483a-b641-0ecd34ac631e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.330 185727 DEBUG oslo_concurrency.lockutils [req-b6137473-c23e-4984-a2c4-abf62924eec5 req-712f1c82-a431-483a-b641-0ecd34ac631e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.330 185727 DEBUG oslo_concurrency.lockutils [req-b6137473-c23e-4984-a2c4-abf62924eec5 req-712f1c82-a431-483a-b641-0ecd34ac631e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.330 185727 DEBUG nova.compute.manager [req-b6137473-c23e-4984-a2c4-abf62924eec5 req-712f1c82-a431-483a-b641-0ecd34ac631e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Processing event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.392 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249859.3918118, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.392 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Started (Lifecycle Event)
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.394 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.397 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.400 185727 INFO nova.virt.libvirt.driver [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance spawned successfully.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.400 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.416 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.421 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.425 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.425 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.426 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.426 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.426 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.427 185727 DEBUG nova.virt.libvirt.driver [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.452 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.453 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249859.3927453, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.453 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Paused (Lifecycle Event)
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.482 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.486 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249859.3969982, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.486 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Resumed (Lifecycle Event)
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.496 185727 INFO nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Took 6.04 seconds to spawn the instance on the hypervisor.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.497 185727 DEBUG nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.508 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.511 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.543 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.581 185727 INFO nova.compute.manager [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Took 6.56 seconds to build instance.
Feb 16 13:50:59 compute-0 nova_compute[185723]: 2026-02-16 13:50:59.597 185727 DEBUG oslo_concurrency.lockutils [None req-4dc8126e-aca3-4847-a26b-d6c55a6d1df0 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:59 compute-0 podman[195053]: time="2026-02-16T13:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:50:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:50:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Feb 16 13:51:00 compute-0 nova_compute[185723]: 2026-02-16 13:51:00.110 185727 DEBUG nova.network.neutron [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updated VIF entry in instance network info cache for port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:51:00 compute-0 nova_compute[185723]: 2026-02-16 13:51:00.111 185727 DEBUG nova.network.neutron [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:00 compute-0 nova_compute[185723]: 2026-02-16 13:51:00.129 185727 DEBUG oslo_concurrency.lockutils [req-64d527c8-87bc-4032-9759-dc4756be9f5a req-47331e79-f684-4ed8-9b3d-0c21559c8f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:00 compute-0 nova_compute[185723]: 2026-02-16 13:51:00.649 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:01 compute-0 openstack_network_exporter[197909]: ERROR   13:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:51:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:51:01 compute-0 openstack_network_exporter[197909]: ERROR   13:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:51:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.444 185727 DEBUG nova.compute.manager [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.444 185727 DEBUG oslo_concurrency.lockutils [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.445 185727 DEBUG oslo_concurrency.lockutils [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.445 185727 DEBUG oslo_concurrency.lockutils [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.445 185727 DEBUG nova.compute.manager [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:01 compute-0 nova_compute[185723]: 2026-02-16 13:51:01.445 185727 WARNING nova.compute.manager [req-608fb407-17ec-47db-96c8-35340d1aab5c req-95149426-6eef-4fa6-ae52-67f04324b7b7 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state None.
Feb 16 13:51:02 compute-0 nova_compute[185723]: 2026-02-16 13:51:02.518 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:03.244 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:03.245 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:03.246 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:05 compute-0 nova_compute[185723]: 2026-02-16 13:51:05.651 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:07 compute-0 nova_compute[185723]: 2026-02-16 13:51:07.520 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:10 compute-0 nova_compute[185723]: 2026-02-16 13:51:10.695 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:11 compute-0 ovn_controller[96072]: 2026-02-16T13:51:11Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:51:11 compute-0 ovn_controller[96072]: 2026-02-16T13:51:11Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:51:12 compute-0 nova_compute[185723]: 2026-02-16 13:51:12.523 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:13 compute-0 podman[216391]: 2026-02-16 13:51:13.017105323 +0000 UTC m=+0.048134137 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:51:13 compute-0 podman[216390]: 2026-02-16 13:51:13.017484133 +0000 UTC m=+0.053287925 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.expose-services=)
Feb 16 13:51:15 compute-0 nova_compute[185723]: 2026-02-16 13:51:15.698 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:17 compute-0 sshd-session[216431]: Invalid user ubuntu from 64.227.72.94 port 41252
Feb 16 13:51:17 compute-0 sshd-session[216431]: Connection closed by invalid user ubuntu 64.227.72.94 port 41252 [preauth]
Feb 16 13:51:17 compute-0 nova_compute[185723]: 2026-02-16 13:51:17.566 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:19 compute-0 podman[216433]: 2026-02-16 13:51:19.047194204 +0000 UTC m=+0.082181913 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:51:20 compute-0 nova_compute[185723]: 2026-02-16 13:51:20.700 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:22 compute-0 nova_compute[185723]: 2026-02-16 13:51:22.569 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:24 compute-0 nova_compute[185723]: 2026-02-16 13:51:24.567 185727 DEBUG nova.compute.manager [None req-f951de9d-b9af-4acf-b91f-a6aa80a4f1b6 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:51:24 compute-0 nova_compute[185723]: 2026-02-16 13:51:24.614 185727 DEBUG nova.compute.provider_tree [None req-f951de9d-b9af-4acf-b91f-a6aa80a4f1b6 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 31 to 37 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:51:25 compute-0 nova_compute[185723]: 2026-02-16 13:51:25.701 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.469 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.470 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.470 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.470 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.531 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:26 compute-0 podman[216461]: 2026-02-16 13:51:26.567288061 +0000 UTC m=+0.055120610 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.584 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.585 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.640 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.763 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.764 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5640MB free_disk=73.19082260131836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.765 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.765 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.871 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.871 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.872 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.894 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.913 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.913 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.937 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:51:26 compute-0 nova_compute[185723]: 2026-02-16 13:51:26.972 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STATUS_DISABLED,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:51:27 compute-0 nova_compute[185723]: 2026-02-16 13:51:27.023 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:51:27 compute-0 nova_compute[185723]: 2026-02-16 13:51:27.039 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:51:27 compute-0 nova_compute[185723]: 2026-02-16 13:51:27.083 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:51:27 compute-0 nova_compute[185723]: 2026-02-16 13:51:27.083 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:27 compute-0 nova_compute[185723]: 2026-02-16 13:51:27.572 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:28 compute-0 nova_compute[185723]: 2026-02-16 13:51:28.084 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:28 compute-0 nova_compute[185723]: 2026-02-16 13:51:28.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:29 compute-0 nova_compute[185723]: 2026-02-16 13:51:29.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:29 compute-0 podman[195053]: time="2026-02-16T13:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:51:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:51:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.384 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Check if temp file /var/lib/nova/instances/tmpldl3rikf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.385 185727 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.449 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.449 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.449 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.450 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:30 compute-0 nova_compute[185723]: 2026-02-16 13:51:30.741 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:31 compute-0 openstack_network_exporter[197909]: ERROR   13:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:51:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:51:31 compute-0 openstack_network_exporter[197909]: ERROR   13:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:51:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:51:31 compute-0 nova_compute[185723]: 2026-02-16 13:51:31.539 185727 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:31 compute-0 nova_compute[185723]: 2026-02-16 13:51:31.590 185727 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:31 compute-0 nova_compute[185723]: 2026-02-16 13:51:31.591 185727 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:31 compute-0 nova_compute[185723]: 2026-02-16 13:51:31.640 185727 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.575 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.760 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.782 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.782 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.782 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.783 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:32 compute-0 nova_compute[185723]: 2026-02-16 13:51:32.783 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:51:33 compute-0 nova_compute[185723]: 2026-02-16 13:51:33.777 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:33 compute-0 nova_compute[185723]: 2026-02-16 13:51:33.778 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:33 compute-0 sshd-session[216498]: Accepted publickey for nova from 192.168.122.101 port 60676 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:51:33 compute-0 systemd-logind[818]: New session 33 of user nova.
Feb 16 13:51:33 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:51:33 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:51:33 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:51:33 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:51:33 compute-0 systemd[216502]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:51:33 compute-0 systemd[216502]: Queued start job for default target Main User Target.
Feb 16 13:51:33 compute-0 systemd[216502]: Created slice User Application Slice.
Feb 16 13:51:33 compute-0 systemd[216502]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:51:33 compute-0 systemd[216502]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:51:33 compute-0 systemd[216502]: Reached target Paths.
Feb 16 13:51:33 compute-0 systemd[216502]: Reached target Timers.
Feb 16 13:51:34 compute-0 systemd[216502]: Starting D-Bus User Message Bus Socket...
Feb 16 13:51:34 compute-0 systemd[216502]: Starting Create User's Volatile Files and Directories...
Feb 16 13:51:34 compute-0 systemd[216502]: Finished Create User's Volatile Files and Directories.
Feb 16 13:51:34 compute-0 systemd[216502]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:51:34 compute-0 systemd[216502]: Reached target Sockets.
Feb 16 13:51:34 compute-0 systemd[216502]: Reached target Basic System.
Feb 16 13:51:34 compute-0 systemd[216502]: Reached target Main User Target.
Feb 16 13:51:34 compute-0 systemd[216502]: Startup finished in 123ms.
Feb 16 13:51:34 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:51:34 compute-0 systemd[1]: Started Session 33 of User nova.
Feb 16 13:51:34 compute-0 sshd-session[216498]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:51:34 compute-0 sshd-session[216517]: Received disconnect from 192.168.122.101 port 60676:11: disconnected by user
Feb 16 13:51:34 compute-0 sshd-session[216517]: Disconnected from user nova 192.168.122.101 port 60676
Feb 16 13:51:34 compute-0 sshd-session[216498]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:51:34 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Feb 16 13:51:34 compute-0 systemd-logind[818]: Session 33 logged out. Waiting for processes to exit.
Feb 16 13:51:34 compute-0 systemd-logind[818]: Removed session 33.
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.779 185727 DEBUG nova.compute.manager [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.781 185727 DEBUG oslo_concurrency.lockutils [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.781 185727 DEBUG oslo_concurrency.lockutils [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.782 185727 DEBUG oslo_concurrency.lockutils [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.782 185727 DEBUG nova.compute.manager [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:34 compute-0 nova_compute[185723]: 2026-02-16 13:51:34.782 185727 DEBUG nova.compute.manager [req-8fed62b8-ad8c-4e60-84e6-b88b9faa8140 req-3cda5b10-a69d-466e-aca1-cbce93df5a00 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.288 185727 INFO nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Took 3.65 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.288 185727 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.314 185727 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9adab84b-a5c2-47b5-9af4-697e29ea3af8),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.335 185727 DEBUG nova.objects.instance [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.337 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.339 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.339 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.356 185727 DEBUG nova.virt.libvirt.vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:50:59Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.356 185727 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.357 185727 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.358 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:51:35 compute-0 nova_compute[185723]:   <mac address="fa:16:3e:cd:81:d6"/>
Feb 16 13:51:35 compute-0 nova_compute[185723]:   <model type="virtio"/>
Feb 16 13:51:35 compute-0 nova_compute[185723]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:51:35 compute-0 nova_compute[185723]:   <mtu size="1442"/>
Feb 16 13:51:35 compute-0 nova_compute[185723]:   <target dev="tapb7a22eb4-a3"/>
Feb 16 13:51:35 compute-0 nova_compute[185723]: </interface>
Feb 16 13:51:35 compute-0 nova_compute[185723]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.358 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.744 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.841 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.842 185727 INFO nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:51:35 compute-0 nova_compute[185723]: 2026-02-16 13:51:35.920 185727 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.425 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.426 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.803 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249896.8030853, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.804 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Paused (Lifecycle Event)
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.832 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.836 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.870 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.888 185727 DEBUG nova.compute.manager [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.889 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.889 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.889 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.889 185727 DEBUG nova.compute.manager [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.889 185727 WARNING nova.compute.manager [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state migrating.
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.890 185727 DEBUG nova.compute.manager [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-changed-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.890 185727 DEBUG nova.compute.manager [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Refreshing instance network info cache due to event network-changed-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.890 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.890 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.890 185727 DEBUG nova.network.neutron [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Refreshing network info cache for port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.929 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.930 185727 DEBUG nova.virt.libvirt.migration [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:51:36 compute-0 kernel: tapb7a22eb4-a3 (unregistering): left promiscuous mode
Feb 16 13:51:36 compute-0 NetworkManager[56177]: <info>  [1771249896.9917] device (tapb7a22eb4-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.991 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:36 compute-0 ovn_controller[96072]: 2026-02-16T13:51:36Z|00230|binding|INFO|Releasing lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 from this chassis (sb_readonly=0)
Feb 16 13:51:36 compute-0 nova_compute[185723]: 2026-02-16 13:51:36.998 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:36 compute-0 ovn_controller[96072]: 2026-02-16T13:51:36Z|00231|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 down in Southbound
Feb 16 13:51:36 compute-0 ovn_controller[96072]: 2026-02-16T13:51:36Z|00232|binding|INFO|Removing iface tapb7a22eb4-a3 ovn-installed in OVS
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.000 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.005 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.018 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '54c1a259-778a-4222-b2c6-8422ea19a065'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '8', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.019 105360 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f unbound from our chassis
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.020 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25f604b5-711f-4df5-a65b-4ca0c988350f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.021 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb29b3d-1ced-4b92-b7fc-359a710c5b98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.022 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f namespace which is not needed anymore
Feb 16 13:51:37 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000019.scope: Deactivated successfully.
Feb 16 13:51:37 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000019.scope: Consumed 13.674s CPU time.
Feb 16 13:51:37 compute-0 systemd-machined[155229]: Machine qemu-21-instance-00000019 terminated.
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [NOTICE]   (216361) : haproxy version is 2.8.14-c23fe91
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [NOTICE]   (216361) : path to executable is /usr/sbin/haproxy
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [WARNING]  (216361) : Exiting Master process...
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [WARNING]  (216361) : Exiting Master process...
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [ALERT]    (216361) : Current worker (216364) exited with code 143 (Terminated)
Feb 16 13:51:37 compute-0 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[216357]: [WARNING]  (216361) : All workers exited. Exiting... (0)
Feb 16 13:51:37 compute-0 systemd[1]: libpod-bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70.scope: Deactivated successfully.
Feb 16 13:51:37 compute-0 podman[216552]: 2026-02-16 13:51:37.141003737 +0000 UTC m=+0.045752848 container died bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:51:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70-userdata-shm.mount: Deactivated successfully.
Feb 16 13:51:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-a1f33c7c6e4b22030a06599fb5d532d0072daa22bd37983c4893cbaef0881720-merged.mount: Deactivated successfully.
Feb 16 13:51:37 compute-0 podman[216552]: 2026-02-16 13:51:37.172306215 +0000 UTC m=+0.077055326 container cleanup bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:51:37 compute-0 systemd[1]: libpod-conmon-bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70.scope: Deactivated successfully.
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.220 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.221 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.221 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:51:37 compute-0 podman[216583]: 2026-02-16 13:51:37.231132616 +0000 UTC m=+0.041216795 container remove bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.236 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1900c9-73ab-47bf-b1ac-abd9063b76ee]: (4, ('Mon Feb 16 01:51:37 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f (bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70)\nbc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70\nMon Feb 16 01:51:37 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f (bc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70)\nbc7b416965aff15e4dc070b4b65082ecd05fb91eee42816af986e3677b9fab70\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.238 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[9d912597-5ae8-4d87-9205-5e6c61bcfa44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.239 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.240 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:37 compute-0 kernel: tap25f604b5-70: left promiscuous mode
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.248 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.254 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[72d424a2-1b21-4e77-9722-860b197c1f74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.271 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[de4fbddb-41eb-431c-917b-c00fb919fa46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.273 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4540016d-5fd9-4a8d-82cb-3cb2a8eae952]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.285 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6263345f-e8c4-4666-a624-e9b914cc38d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594205, 'reachable_time': 43615, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216618, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.287 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:51:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:37.287 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[e493a4d8-2e9b-430f-85ca-fee76cabc452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d25f604b5\x2d711f\x2d4df5\x2da65b\x2d4ca0c988350f.mount: Deactivated successfully.
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.431 185727 DEBUG nova.virt.libvirt.guest [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32' (instance-00000019) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.432 185727 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migration operation has completed
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.432 185727 INFO nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] _post_live_migration() is started..
Feb 16 13:51:37 compute-0 nova_compute[185723]: 2026-02-16 13:51:37.577 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.040 185727 DEBUG nova.compute.manager [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.040 185727 DEBUG oslo_concurrency.lockutils [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.040 185727 DEBUG oslo_concurrency.lockutils [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.040 185727 DEBUG oslo_concurrency.lockutils [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.041 185727 DEBUG nova.compute.manager [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.041 185727 DEBUG nova.compute.manager [req-a3902a62-5b1d-4b5e-8b18-2a1523c4565e req-9a74645f-d33e-43f6-b304-8c89b24f460d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.146 185727 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.146 185727 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.147 185727 DEBUG nova.virt.libvirt.vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:51:28Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.147 185727 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.148 185727 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.148 185727 DEBUG os_vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.150 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.150 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7a22eb4-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.151 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.154 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.156 185727 INFO os_vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3')
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.156 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.157 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.157 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.157 185727 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.158 185727 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Deleting instance files /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32_del
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.159 185727 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Deletion of /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32_del complete
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.246 185727 DEBUG nova.network.neutron [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updated VIF entry in instance network info cache for port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.247 185727 DEBUG nova.network.neutron [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.285 185727 DEBUG oslo_concurrency.lockutils [req-185b2f17-134e-45e1-bdf2-947195cc48e0 req-81bed311-e8dd-4aa0-a1f5-8434aa2b64cf faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.994 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.994 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.994 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.995 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.995 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.995 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.995 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.996 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.996 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.996 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.996 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.996 185727 WARNING nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state migrating.
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.997 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.997 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.997 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.997 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.998 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.998 185727 WARNING nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state migrating.
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.998 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.998 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.998 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.999 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.999 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:38 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.999 185727 WARNING nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state migrating.
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:38.999 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:39.000 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:39.000 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:39.000 185727 DEBUG oslo_concurrency.lockutils [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:39.000 185727 DEBUG nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:39 compute-0 nova_compute[185723]: 2026-02-16 13:51:39.001 185727 WARNING nova.compute.manager [req-de1c83ce-a6db-4188-a2fe-8ab5d9ae8c1b req-537f5a7e-dbf0-4d37-a14f-15c0186a241f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state active and task_state migrating.
Feb 16 13:51:40 compute-0 nova_compute[185723]: 2026-02-16 13:51:40.746 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:42 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:51:43 compute-0 nova_compute[185723]: 2026-02-16 13:51:43.153 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:44 compute-0 podman[216621]: 2026-02-16 13:51:44.01224968 +0000 UTC m=+0.048214909 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:51:44 compute-0 podman[216620]: 2026-02-16 13:51:44.017667525 +0000 UTC m=+0.056392122 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.154 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.155 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.155 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.176 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.176 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.176 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.177 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:51:44 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:51:44 compute-0 systemd[216502]: Activating special unit Exit the Session...
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped target Main User Target.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped target Basic System.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped target Paths.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped target Sockets.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped target Timers.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:51:44 compute-0 systemd[216502]: Closed D-Bus User Message Bus Socket.
Feb 16 13:51:44 compute-0 systemd[216502]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:51:44 compute-0 systemd[216502]: Removed slice User Application Slice.
Feb 16 13:51:44 compute-0 systemd[216502]: Reached target Shutdown.
Feb 16 13:51:44 compute-0 systemd[216502]: Finished Exit the Session.
Feb 16 13:51:44 compute-0 systemd[216502]: Reached target Exit the Session.
Feb 16 13:51:44 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:51:44 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:51:44 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:51:44 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:51:44 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:51:44 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:51:44 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.309 185727 WARNING nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.310 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.22003173828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.310 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.311 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.376 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.399 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.436 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 9adab84b-a5c2-47b5-9af4-697e29ea3af8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.437 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.437 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.501 185727 DEBUG nova.compute.provider_tree [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.531 185727 DEBUG nova.scheduler.client.report [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.563 185727 DEBUG nova.compute.resource_tracker [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.564 185727 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.569 185727 INFO nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.695 185727 INFO nova.scheduler.client.report [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 9adab84b-a5c2-47b5-9af4-697e29ea3af8
Feb 16 13:51:44 compute-0 nova_compute[185723]: 2026-02-16 13:51:44.696 185727 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:51:44 compute-0 sshd-session[216662]: Invalid user test from 146.190.226.24 port 56178
Feb 16 13:51:44 compute-0 sshd-session[216662]: Connection closed by invalid user test 146.190.226.24 port 56178 [preauth]
Feb 16 13:51:45 compute-0 nova_compute[185723]: 2026-02-16 13:51:45.747 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:45 compute-0 sshd-session[216664]: Invalid user postgres from 188.166.42.159 port 60342
Feb 16 13:51:45 compute-0 sshd-session[216664]: Connection closed by invalid user postgres 188.166.42.159 port 60342 [preauth]
Feb 16 13:51:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:47.446 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:47 compute-0 nova_compute[185723]: 2026-02-16 13:51:47.447 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:47.447 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:51:48 compute-0 nova_compute[185723]: 2026-02-16 13:51:48.156 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-0 podman[216666]: 2026-02-16 13:51:50.044167138 +0000 UTC m=+0.079288811 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Feb 16 13:51:50 compute-0 nova_compute[185723]: 2026-02-16 13:51:50.750 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:52 compute-0 nova_compute[185723]: 2026-02-16 13:51:52.219 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249897.2182035, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:52 compute-0 nova_compute[185723]: 2026-02-16 13:51:52.220 185727 INFO nova.compute.manager [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Stopped (Lifecycle Event)
Feb 16 13:51:52 compute-0 nova_compute[185723]: 2026-02-16 13:51:52.261 185727 DEBUG nova.compute.manager [None req-afede021-f080-4135-9535-0614e44a8d61 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:53 compute-0 nova_compute[185723]: 2026-02-16 13:51:53.159 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:55 compute-0 nova_compute[185723]: 2026-02-16 13:51:55.751 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:57 compute-0 podman[216692]: 2026-02-16 13:51:57.006044394 +0000 UTC m=+0.044464686 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:51:57 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:51:57.449 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:58 compute-0 nova_compute[185723]: 2026-02-16 13:51:58.161 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:59 compute-0 podman[195053]: time="2026-02-16T13:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:51:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:51:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:52:00 compute-0 nova_compute[185723]: 2026-02-16 13:52:00.753 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:01 compute-0 anacron[60464]: Job `cron.monthly' started
Feb 16 13:52:01 compute-0 anacron[60464]: Job `cron.monthly' terminated
Feb 16 13:52:01 compute-0 anacron[60464]: Normal exit (3 jobs run)
Feb 16 13:52:01 compute-0 openstack_network_exporter[197909]: ERROR   13:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:52:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:52:01 compute-0 openstack_network_exporter[197909]: ERROR   13:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:52:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:52:03 compute-0 nova_compute[185723]: 2026-02-16 13:52:03.164 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:03.246 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:03.246 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:03.246 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:03 compute-0 sshd-session[216718]: Invalid user ubuntu from 64.227.72.94 port 47472
Feb 16 13:52:03 compute-0 sshd-session[216718]: Connection closed by invalid user ubuntu 64.227.72.94 port 47472 [preauth]
Feb 16 13:52:05 compute-0 nova_compute[185723]: 2026-02-16 13:52:05.756 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:08 compute-0 nova_compute[185723]: 2026-02-16 13:52:08.188 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:10 compute-0 nova_compute[185723]: 2026-02-16 13:52:10.757 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:11 compute-0 nova_compute[185723]: 2026-02-16 13:52:11.205 185727 DEBUG nova.compute.manager [None req-90629bff-7a40-49e3-b7dd-ce9ae66c233c d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:52:11 compute-0 nova_compute[185723]: 2026-02-16 13:52:11.262 185727 DEBUG nova.compute.provider_tree [None req-90629bff-7a40-49e3-b7dd-ce9ae66c233c d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 generation from 37 to 40 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:52:13 compute-0 nova_compute[185723]: 2026-02-16 13:52:13.190 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:15 compute-0 podman[216721]: 2026-02-16 13:52:15.223114412 +0000 UTC m=+0.264231496 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:52:15 compute-0 podman[216720]: 2026-02-16 13:52:15.250210436 +0000 UTC m=+0.292483359 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.buildah.version=1.33.7, version=9.7, build-date=2026-02-05T04:57:10Z)
Feb 16 13:52:15 compute-0 nova_compute[185723]: 2026-02-16 13:52:15.286 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:15 compute-0 nova_compute[185723]: 2026-02-16 13:52:15.760 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:18 compute-0 nova_compute[185723]: 2026-02-16 13:52:18.193 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:20 compute-0 nova_compute[185723]: 2026-02-16 13:52:20.808 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:21 compute-0 podman[216758]: 2026-02-16 13:52:21.02563555 +0000 UTC m=+0.068009261 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:52:23 compute-0 nova_compute[185723]: 2026-02-16 13:52:23.230 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:25 compute-0 nova_compute[185723]: 2026-02-16 13:52:25.810 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.457 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.459 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.616 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.618 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5853MB free_disk=73.22002792358398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.618 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.619 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.682 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.683 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.701 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.716 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.718 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:52:26 compute-0 nova_compute[185723]: 2026-02-16 13:52:26.718 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:27 compute-0 sshd-session[216784]: Invalid user apache from 146.190.22.227 port 58866
Feb 16 13:52:27 compute-0 nova_compute[185723]: 2026-02-16 13:52:27.718 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:27 compute-0 nova_compute[185723]: 2026-02-16 13:52:27.718 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:27 compute-0 podman[216786]: 2026-02-16 13:52:27.761385006 +0000 UTC m=+0.066568935 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:52:28 compute-0 sshd-session[216784]: Connection closed by invalid user apache 146.190.22.227 port 58866 [preauth]
Feb 16 13:52:28 compute-0 nova_compute[185723]: 2026-02-16 13:52:28.232 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:28 compute-0 nova_compute[185723]: 2026-02-16 13:52:28.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:29 compute-0 podman[195053]: time="2026-02-16T13:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:52:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:52:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.449 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.449 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.811 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:30 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:30.857 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:52:30 compute-0 nova_compute[185723]: 2026-02-16 13:52:30.858 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:30 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:30.858 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:52:30 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:30.859 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:31 compute-0 openstack_network_exporter[197909]: ERROR   13:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:52:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:52:31 compute-0 openstack_network_exporter[197909]: ERROR   13:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:52:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:52:32 compute-0 nova_compute[185723]: 2026-02-16 13:52:32.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:33 compute-0 nova_compute[185723]: 2026-02-16 13:52:33.236 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:33 compute-0 nova_compute[185723]: 2026-02-16 13:52:33.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:34 compute-0 nova_compute[185723]: 2026-02-16 13:52:34.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:34 compute-0 nova_compute[185723]: 2026-02-16 13:52:34.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:52:35 compute-0 nova_compute[185723]: 2026-02-16 13:52:35.813 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:38 compute-0 nova_compute[185723]: 2026-02-16 13:52:38.238 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:40 compute-0 nova_compute[185723]: 2026-02-16 13:52:40.814 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:42 compute-0 sshd-session[216811]: Invalid user postgres from 188.166.42.159 port 45128
Feb 16 13:52:42 compute-0 sshd-session[216811]: Connection closed by invalid user postgres 188.166.42.159 port 45128 [preauth]
Feb 16 13:52:43 compute-0 nova_compute[185723]: 2026-02-16 13:52:43.241 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:45 compute-0 nova_compute[185723]: 2026-02-16 13:52:45.815 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:46 compute-0 podman[216814]: 2026-02-16 13:52:46.002623583 +0000 UTC m=+0.042798214 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:52:46 compute-0 podman[216813]: 2026-02-16 13:52:46.024254781 +0000 UTC m=+0.063268973 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, version=9.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:52:48 compute-0 nova_compute[185723]: 2026-02-16 13:52:48.244 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.722 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.722 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.746 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.864 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.865 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.873 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.874 185727 INFO nova.compute.claims [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Claim successful on node compute-0.ctlplane.example.com
Feb 16 13:52:49 compute-0 sshd-session[216855]: Invalid user ubuntu from 64.227.72.94 port 33794
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.966 185727 DEBUG nova.compute.provider_tree [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:52:49 compute-0 nova_compute[185723]: 2026-02-16 13:52:49.996 185727 DEBUG nova.scheduler.client.report [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.023 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.024 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:52:50 compute-0 sshd-session[216855]: Connection closed by invalid user ubuntu 64.227.72.94 port 33794 [preauth]
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.098 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.099 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.133 185727 INFO nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.158 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:52:50 compute-0 ovn_controller[96072]: 2026-02-16T13:52:50Z|00233|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.301 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.303 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.303 185727 INFO nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Creating image(s)
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.304 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.304 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.305 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.317 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.366 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.367 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.367 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.378 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.424 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.425 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.452 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.453 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.453 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.518 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.519 185727 DEBUG nova.virt.disk.api [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Checking if we can resize image /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.520 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.580 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.581 185727 DEBUG nova.virt.disk.api [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Cannot resize image /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.582 185727 DEBUG nova.objects.instance [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'migration_context' on Instance uuid 169216bd-b5ad-4408-8962-d36ad92cbf8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.605 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.605 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Ensure instance console log exists: /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.606 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.606 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.606 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:50 compute-0 nova_compute[185723]: 2026-02-16 13:52:50.869 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:51 compute-0 nova_compute[185723]: 2026-02-16 13:52:51.185 185727 DEBUG nova.policy [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '178de9ab917a4ba5a84dc9f520a0847f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:52:52 compute-0 podman[216872]: 2026-02-16 13:52:52.100443838 +0000 UTC m=+0.139043616 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:52:53 compute-0 sshd-session[216898]: Invalid user test from 146.190.226.24 port 58598
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.282 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:53 compute-0 sshd-session[216898]: Connection closed by invalid user test 146.190.226.24 port 58598 [preauth]
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.434 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.434 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.435 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.435 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.435 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.436 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.477 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.478 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Image id 6fb9af7f-2971-4890-a777-6e99e888717f yields fingerprint 755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.478 185727 INFO nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] image 6fb9af7f-2971-4890-a777-6e99e888717f at (/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7): checking
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.478 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] image 6fb9af7f-2971-4890-a777-6e99e888717f at (/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.479 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.480 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] 169216bd-b5ad-4408-8962-d36ad92cbf8c is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.480 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] 169216bd-b5ad-4408-8962-d36ad92cbf8c has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.480 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.531 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.532 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 169216bd-b5ad-4408-8962-d36ad92cbf8c is backed by 755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.532 185727 INFO nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Active base files: /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.532 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.532 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Feb 16 13:52:53 compute-0 nova_compute[185723]: 2026-02-16 13:52:53.533 185727 DEBUG nova.virt.libvirt.imagecache [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Feb 16 13:52:54 compute-0 nova_compute[185723]: 2026-02-16 13:52:54.398 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Successfully created port: 1f9f033c-6441-4f4d-a631-8d0870baa901 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:52:55 compute-0 nova_compute[185723]: 2026-02-16 13:52:55.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.232 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Successfully updated port: 1f9f033c-6441-4f4d-a631-8d0870baa901 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.261 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.262 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquired lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.262 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.363 185727 DEBUG nova.compute.manager [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-changed-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.364 185727 DEBUG nova.compute.manager [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Refreshing instance network info cache due to event network-changed-1f9f033c-6441-4f4d-a631-8d0870baa901. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.364 185727 DEBUG oslo_concurrency.lockutils [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:52:57 compute-0 nova_compute[185723]: 2026-02-16 13:52:57.454 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:52:58 compute-0 podman[216903]: 2026-02-16 13:52:58.007135933 +0000 UTC m=+0.046360443 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.285 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.620 185727 DEBUG nova.network.neutron [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updating instance_info_cache with network_info: [{"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.645 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Releasing lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.646 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Instance network_info: |[{"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.646 185727 DEBUG oslo_concurrency.lockutils [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.647 185727 DEBUG nova.network.neutron [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Refreshing network info cache for port 1f9f033c-6441-4f4d-a631-8d0870baa901 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.650 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Start _get_guest_xml network_info=[{"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.656 185727 WARNING nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.661 185727 DEBUG nova.virt.libvirt.host [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.661 185727 DEBUG nova.virt.libvirt.host [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.664 185727 DEBUG nova.virt.libvirt.host [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.665 185727 DEBUG nova.virt.libvirt.host [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.666 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.667 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.667 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.668 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.668 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.668 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.669 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.669 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.669 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.670 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.670 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.670 185727 DEBUG nova.virt.hardware [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.674 185727 DEBUG nova.virt.libvirt.vif [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:52:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-5212048',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-5212048',id=27,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-wp4nxnp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingS
trategy-492275053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:52:50Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=169216bd-b5ad-4408-8962-d36ad92cbf8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.675 185727 DEBUG nova.network.os_vif_util [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.676 185727 DEBUG nova.network.os_vif_util [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.678 185727 DEBUG nova.objects.instance [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'pci_devices' on Instance uuid 169216bd-b5ad-4408-8962-d36ad92cbf8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.696 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <uuid>169216bd-b5ad-4408-8962-d36ad92cbf8c</uuid>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <name>instance-0000001b</name>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <memory>131072</memory>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <vcpu>1</vcpu>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <metadata>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-5212048</nova:name>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:creationTime>2026-02-16 13:52:58</nova:creationTime>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:flavor name="m1.nano">
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:memory>128</nova:memory>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:disk>1</nova:disk>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:swap>0</nova:swap>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       </nova:flavor>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:owner>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:user uuid="178de9ab917a4ba5a84dc9f520a0847f">tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member</nova:user>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:project uuid="88d7e9d22dc247d4b0e2e95ecc7e73ad">tempest-TestExecuteWorkloadBalancingStrategy-492275053</nova:project>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       </nova:owner>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <nova:ports>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         <nova:port uuid="1f9f033c-6441-4f4d-a631-8d0870baa901">
Feb 16 13:52:58 compute-0 nova_compute[185723]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:         </nova:port>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       </nova:ports>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </nova:instance>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </metadata>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <sysinfo type="smbios">
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <system>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="serial">169216bd-b5ad-4408-8962-d36ad92cbf8c</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="uuid">169216bd-b5ad-4408-8962-d36ad92cbf8c</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </system>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </sysinfo>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <os>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <boot dev="hd"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <smbios mode="sysinfo"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </os>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <features>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <acpi/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <apic/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <vmcoreinfo/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </features>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <clock offset="utc">
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <timer name="hpet" present="no"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </clock>
Feb 16 13:52:58 compute-0 rsyslogd[1017]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <cpu mode="custom" match="exact">
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <model>Nehalem</model>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </cpu>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   <devices>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <disk type="file" device="disk">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <target dev="vda" bus="virtio"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <disk type="file" device="cdrom">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <source file="/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.config"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <target dev="sda" bus="sata"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </disk>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <interface type="ethernet">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <mac address="fa:16:3e:23:9d:0a"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <mtu size="1442"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <target dev="tap1f9f033c-64"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </interface>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <serial type="pty">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <log file="/var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/console.log" append="off"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </serial>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <video>
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <model type="virtio"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </video>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <input type="tablet" bus="usb"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <rng model="virtio">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </rng>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <controller type="usb" index="0"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     <memballoon model="virtio">
Feb 16 13:52:58 compute-0 nova_compute[185723]:       <stats period="10"/>
Feb 16 13:52:58 compute-0 nova_compute[185723]:     </memballoon>
Feb 16 13:52:58 compute-0 nova_compute[185723]:   </devices>
Feb 16 13:52:58 compute-0 nova_compute[185723]: </domain>
Feb 16 13:52:58 compute-0 nova_compute[185723]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.697 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Preparing to wait for external event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.698 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.699 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.699 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.700 185727 DEBUG nova.virt.libvirt.vif [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:52:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-5212048',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-5212048',id=27,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-wp4nxnp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:52:50Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=169216bd-b5ad-4408-8962-d36ad92cbf8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.700 185727 DEBUG nova.network.os_vif_util [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.701 185727 DEBUG nova.network.os_vif_util [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.702 185727 DEBUG os_vif [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.702 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.703 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.704 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.706 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.707 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f9f033c-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.707 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f9f033c-64, col_values=(('external_ids', {'iface-id': '1f9f033c-6441-4f4d-a631-8d0870baa901', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:9d:0a', 'vm-uuid': '169216bd-b5ad-4408-8962-d36ad92cbf8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.709 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:58 compute-0 NetworkManager[56177]: <info>  [1771249978.7099] manager: (tap1f9f033c-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.713 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.728 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.729 185727 INFO os_vif [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64')
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.777 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.777 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.777 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No VIF found with MAC fa:16:3e:23:9d:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:52:58 compute-0 nova_compute[185723]: 2026-02-16 13:52:58.778 185727 INFO nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Using config drive
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.438 185727 INFO nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Creating config drive at /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.config
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.442 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiwiibi8t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.565 185727 DEBUG oslo_concurrency.processutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiwiibi8t" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:52:59 compute-0 kernel: tap1f9f033c-64: entered promiscuous mode
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.6122] manager: (tap1f9f033c-64): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.614 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_controller[96072]: 2026-02-16T13:52:59Z|00234|binding|INFO|Claiming lport 1f9f033c-6441-4f4d-a631-8d0870baa901 for this chassis.
Feb 16 13:52:59 compute-0 ovn_controller[96072]: 2026-02-16T13:52:59Z|00235|binding|INFO|1f9f033c-6441-4f4d-a631-8d0870baa901: Claiming fa:16:3e:23:9d:0a 10.100.0.8
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.621 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.631 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:9d:0a 10.100.0.8'], port_security=['fa:16:3e:23:9d:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '169216bd-b5ad-4408-8962-d36ad92cbf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=1f9f033c-6441-4f4d-a631-8d0870baa901) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.632 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 1f9f033c-6441-4f4d-a631-8d0870baa901 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c bound to our chassis
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.634 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:52:59 compute-0 systemd-udevd[216946]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.639 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_controller[96072]: 2026-02-16T13:52:59Z|00236|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 ovn-installed in OVS
Feb 16 13:52:59 compute-0 ovn_controller[96072]: 2026-02-16T13:52:59Z|00237|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 up in Southbound
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.642 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.643 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[81f97755-5572-430f-9aab-a948d3204cbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.644 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f3f30c5-b1 in ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.645 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f3f30c5-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.645 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[49be002e-f002-49ac-b941-b49afd7e5d16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.646 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[92f726a8-b353-41cc-ad66-47abab75a527]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 systemd-machined[155229]: New machine qemu-22-instance-0000001b.
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.6507] device (tap1f9f033c-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.6512] device (tap1f9f033c-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:52:59 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001b.
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.655 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[93499b28-dd49-461d-95d9-04dc7341ed3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.667 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a72f0726-6cc3-4593-b468-1c6977ba7bf2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.686 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[6574a88b-47c9-4fa5-83d7-61b82f4bb5c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.690 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c3d37e-21f4-408a-b8cb-5c939706687a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.6918] manager: (tap9f3f30c5-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Feb 16 13:52:59 compute-0 systemd-udevd[216951]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.715 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[52361dd7-c1f0-4391-98f2-db5926a25d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.718 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[6f877f7a-7b6a-4ab3-87ff-e1d1fd8cd8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.7404] device (tap9f3f30c5-b0): carrier: link connected
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.745 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[a029d089-324e-4417-a0c7-9dfc47a385e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 podman[195053]: time="2026-02-16T13:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:52:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:52:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.762 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c400d0-3742-4382-b1d4-a35aab672dda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606313, 'reachable_time': 16457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216980, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.776 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[048e80af-b7c8-4635-9215-7fd1492d9237]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:b836'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606313, 'tstamp': 606313}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216981, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.794 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8b484239-f921-433b-9733-56f371383766]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 90, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 90, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606313, 'reachable_time': 16457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 224, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 224, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216982, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.821 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[0acd71ee-98e0-4da1-b434-bbacfef28eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.866 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[11bbd9c3-5771-49aa-85e9-497f6dbb08a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.868 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.869 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.869 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3f30c5-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:59 compute-0 kernel: tap9f3f30c5-b0: entered promiscuous mode
Feb 16 13:52:59 compute-0 NetworkManager[56177]: <info>  [1771249979.8723] manager: (tap9f3f30c5-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.871 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.875 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3f30c5-b0, col_values=(('external_ids', {'iface-id': '340fa0af-180b-44ed-9c22-e18a8f5ebdec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:59 compute-0 ovn_controller[96072]: 2026-02-16T13:52:59Z|00238|binding|INFO|Releasing lport 340fa0af-180b-44ed-9c22-e18a8f5ebdec from this chassis (sb_readonly=0)
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.878 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.879 185727 DEBUG nova.compute.manager [req-8d069ff0-abe9-4fe9-8d52-af3bec8cb910 req-17112860-48dc-4b42-a371-cb399cb2e5e4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.879 185727 DEBUG oslo_concurrency.lockutils [req-8d069ff0-abe9-4fe9-8d52-af3bec8cb910 req-17112860-48dc-4b42-a371-cb399cb2e5e4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.879 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[be871947-83f2-4178-a387-77a3d9445f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.880 185727 DEBUG oslo_concurrency.lockutils [req-8d069ff0-abe9-4fe9-8d52-af3bec8cb910 req-17112860-48dc-4b42-a371-cb399cb2e5e4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.880 185727 DEBUG oslo_concurrency.lockutils [req-8d069ff0-abe9-4fe9-8d52-af3bec8cb910 req-17112860-48dc-4b42-a371-cb399cb2e5e4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.880 185727 DEBUG nova.compute.manager [req-8d069ff0-abe9-4fe9-8d52-af3bec8cb910 req-17112860-48dc-4b42-a371-cb399cb2e5e4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Processing event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:52:59 compute-0 nova_compute[185723]: 2026-02-16 13:52:59.880 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.881 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:52:59 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:52:59.881 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'env', 'PROCESS_TAG=haproxy-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.074 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.075 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249980.0736325, 169216bd-b5ad-4408-8962-d36ad92cbf8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.075 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] VM Started (Lifecycle Event)
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.084 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.088 185727 INFO nova.virt.libvirt.driver [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Instance spawned successfully.
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.089 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.094 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.098 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.109 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.109 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.110 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.110 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.111 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.111 185727 DEBUG nova.virt.libvirt.driver [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.122 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.123 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249980.0775137, 169216bd-b5ad-4408-8962-d36ad92cbf8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.123 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] VM Paused (Lifecycle Event)
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.162 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.167 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771249980.083791, 169216bd-b5ad-4408-8962-d36ad92cbf8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.168 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] VM Resumed (Lifecycle Event)
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.215 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:00 compute-0 podman[217022]: 2026-02-16 13:53:00.21793457 +0000 UTC m=+0.052188468 container create aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.219 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.230 185727 INFO nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Took 9.93 seconds to spawn the instance on the hypervisor.
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.231 185727 DEBUG nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.246 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:53:00 compute-0 systemd[1]: Started libpod-conmon-aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936.scope.
Feb 16 13:53:00 compute-0 podman[217022]: 2026-02-16 13:53:00.188895688 +0000 UTC m=+0.023149596 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:53:00 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:53:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3b1ea43ac2b40e9d87529f422f7d9eae31c604773733c64a580258fd332b817/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:53:00 compute-0 podman[217022]: 2026-02-16 13:53:00.301491806 +0000 UTC m=+0.135745734 container init aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:53:00 compute-0 podman[217022]: 2026-02-16 13:53:00.306238874 +0000 UTC m=+0.140492762 container start aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:53:00 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [NOTICE]   (217040) : New worker (217042) forked
Feb 16 13:53:00 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [NOTICE]   (217040) : Loading success.
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.330 185727 INFO nova.compute.manager [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Took 10.50 seconds to build instance.
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.369 185727 DEBUG oslo_concurrency.lockutils [None req-65becf34-ed00-4b55-b959-bde518f22e80 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:00 compute-0 nova_compute[185723]: 2026-02-16 13:53:00.873 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.218 185727 DEBUG nova.network.neutron [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updated VIF entry in instance network info cache for port 1f9f033c-6441-4f4d-a631-8d0870baa901. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.219 185727 DEBUG nova.network.neutron [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updating instance_info_cache with network_info: [{"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.238 185727 DEBUG oslo_concurrency.lockutils [req-c9412763-6319-41b1-bbf7-cbe6d3fd6367 req-9a16b227-907b-42e7-8cb3-bd0fa24599c8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:01 compute-0 openstack_network_exporter[197909]: ERROR   13:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:53:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:53:01 compute-0 openstack_network_exporter[197909]: ERROR   13:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:53:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.978 185727 DEBUG nova.compute.manager [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.979 185727 DEBUG oslo_concurrency.lockutils [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.979 185727 DEBUG oslo_concurrency.lockutils [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.980 185727 DEBUG oslo_concurrency.lockutils [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.980 185727 DEBUG nova.compute.manager [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] No waiting events found dispatching network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:01 compute-0 nova_compute[185723]: 2026-02-16 13:53:01.980 185727 WARNING nova.compute.manager [req-a2172b0e-ef09-459c-a16f-89d0e73786a8 req-1644fed3-f3bc-418f-98f5-ce45f1c2b1f1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received unexpected event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 for instance with vm_state active and task_state None.
Feb 16 13:53:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:03.248 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:03.249 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:03.250 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:03 compute-0 nova_compute[185723]: 2026-02-16 13:53:03.712 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:05 compute-0 nova_compute[185723]: 2026-02-16 13:53:05.901 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:08 compute-0 nova_compute[185723]: 2026-02-16 13:53:08.715 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-0 nova_compute[185723]: 2026-02-16 13:53:10.902 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-0 ovn_controller[96072]: 2026-02-16T13:53:11Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:9d:0a 10.100.0.8
Feb 16 13:53:11 compute-0 ovn_controller[96072]: 2026-02-16T13:53:11Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:9d:0a 10.100.0.8
Feb 16 13:53:13 compute-0 nova_compute[185723]: 2026-02-16 13:53:13.719 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:15 compute-0 nova_compute[185723]: 2026-02-16 13:53:15.903 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:17 compute-0 podman[217068]: 2026-02-16 13:53:17.012495498 +0000 UTC m=+0.046048065 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 16 13:53:17 compute-0 podman[217067]: 2026-02-16 13:53:17.016133849 +0000 UTC m=+0.051949422 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Feb 16 13:53:18 compute-0 nova_compute[185723]: 2026-02-16 13:53:18.721 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:19 compute-0 nova_compute[185723]: 2026-02-16 13:53:19.874 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating tmpfile /var/lib/nova/instances/tmpq8uj_sv2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.020 185727 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.773 185727 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.809 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.810 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquired lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.810 185727 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:53:20 compute-0 nova_compute[185723]: 2026-02-16 13:53:20.906 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:21 compute-0 nova_compute[185723]: 2026-02-16 13:53:21.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:21 compute-0 nova_compute[185723]: 2026-02-16 13:53:21.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:53:21 compute-0 nova_compute[185723]: 2026-02-16 13:53:21.460 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.180 185727 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.204 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Releasing lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.206 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.206 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating instance directory: /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.207 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating disk.info with the contents: {'/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk': 'qcow2', '/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.207 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.208 185727 DEBUG nova.objects.instance [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'trusted_certs' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.246 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.297 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.298 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.299 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.309 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.354 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.355 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.382 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.383 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.383 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.431 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.432 185727 DEBUG nova.virt.disk.api [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Checking if we can resize image /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.433 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.485 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.486 185727 DEBUG nova.virt.disk.api [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Cannot resize image /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.486 185727 DEBUG nova.objects.instance [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'migration_context' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.503 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.524 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.526 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config to /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.526 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.932 185727 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.933 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.934 185727 DEBUG nova.virt.libvirt.vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:53:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:53:11Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.934 185727 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.935 185727 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.936 185727 DEBUG os_vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.936 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.937 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.937 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.940 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.941 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda0306b3-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.941 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda0306b3-85, col_values=(('external_ids', {'iface-id': 'da0306b3-8514-4ef0-984c-14d90dedd285', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:49:ad', 'vm-uuid': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:22 compute-0 NetworkManager[56177]: <info>  [1771250002.9446] manager: (tapda0306b3-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.946 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.949 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.950 185727 INFO os_vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85')
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.950 185727 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:53:22 compute-0 nova_compute[185723]: 2026-02-16 13:53:22.950 185727 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:53:23 compute-0 podman[217125]: 2026-02-16 13:53:23.031793542 +0000 UTC m=+0.068738550 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 16 13:53:25 compute-0 nova_compute[185723]: 2026-02-16 13:53:25.909 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.621 185727 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Port da0306b3-8514-4ef0-984c-14d90dedd285 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.623 185727 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:53:27 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:53:27 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:53:27 compute-0 kernel: tapda0306b3-85: entered promiscuous mode
Feb 16 13:53:27 compute-0 NetworkManager[56177]: <info>  [1771250007.9192] manager: (tapda0306b3-85): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.920 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:27 compute-0 ovn_controller[96072]: 2026-02-16T13:53:27Z|00239|binding|INFO|Claiming lport da0306b3-8514-4ef0-984c-14d90dedd285 for this additional chassis.
Feb 16 13:53:27 compute-0 ovn_controller[96072]: 2026-02-16T13:53:27Z|00240|binding|INFO|da0306b3-8514-4ef0-984c-14d90dedd285: Claiming fa:16:3e:04:49:ad 10.100.0.5
Feb 16 13:53:27 compute-0 ovn_controller[96072]: 2026-02-16T13:53:27Z|00241|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 ovn-installed in OVS
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.925 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.929 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:27 compute-0 nova_compute[185723]: 2026-02-16 13:53:27.943 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:27 compute-0 systemd-udevd[217189]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:53:27 compute-0 systemd-machined[155229]: New machine qemu-23-instance-0000001c.
Feb 16 13:53:27 compute-0 NetworkManager[56177]: <info>  [1771250007.9561] device (tapda0306b3-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:53:27 compute-0 NetworkManager[56177]: <info>  [1771250007.9566] device (tapda0306b3-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:53:27 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001c.
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.460 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.461 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.494 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.494 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.494 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.494 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.584 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771250008.5843117, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.585 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Started (Lifecycle Event)
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.605 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.612 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:28 compute-0 podman[217206]: 2026-02-16 13:53:28.614162659 +0000 UTC m=+0.076802889 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.666 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.668 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.729 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.739 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.789 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.791 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:28 compute-0 nova_compute[185723]: 2026-02-16 13:53:28.866 185727 DEBUG oslo_concurrency.processutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.044 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.046 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5617MB free_disk=73.19053649902344GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.046 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.047 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.109 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Migration for instance b81c5faa-2832-4df4-8db7-1ffb8d8209ab refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.131 185727 INFO nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating resource usage from migration 23baa630-8720-4df0-ae64-9dd4c6785d6f
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.132 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Starting to track incoming migration 23baa630-8720-4df0-ae64-9dd4c6785d6f with flavor 6d89f72c-1760-421e-a5f2-83dfc3723b84 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.283 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771250009.2831168, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.284 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Resumed (Lifecycle Event)
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.290 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 169216bd-b5ad-4408-8962-d36ad92cbf8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.313 185727 WARNING nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance b81c5faa-2832-4df4-8db7-1ffb8d8209ab has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.314 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.314 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.317 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.320 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.344 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.437 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.454 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.487 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:53:29 compute-0 nova_compute[185723]: 2026-02-16 13:53:29.487 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:29 compute-0 podman[195053]: time="2026-02-16T13:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:53:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 13:53:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Feb 16 13:53:30 compute-0 nova_compute[185723]: 2026-02-16 13:53:30.461 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:30 compute-0 nova_compute[185723]: 2026-02-16 13:53:30.462 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:30 compute-0 nova_compute[185723]: 2026-02-16 13:53:30.965 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-0 ovn_controller[96072]: 2026-02-16T13:53:31Z|00242|binding|INFO|Claiming lport da0306b3-8514-4ef0-984c-14d90dedd285 for this chassis.
Feb 16 13:53:31 compute-0 ovn_controller[96072]: 2026-02-16T13:53:31Z|00243|binding|INFO|da0306b3-8514-4ef0-984c-14d90dedd285: Claiming fa:16:3e:04:49:ad 10.100.0.5
Feb 16 13:53:31 compute-0 ovn_controller[96072]: 2026-02-16T13:53:31Z|00244|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 up in Southbound
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.286 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:49:ad 10.100.0.5'], port_security=['fa:16:3e:04:49:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '11', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=da0306b3-8514-4ef0-984c-14d90dedd285) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.288 105360 INFO neutron.agent.ovn.metadata.agent [-] Port da0306b3-8514-4ef0-984c-14d90dedd285 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c bound to our chassis
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.289 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.291 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.293 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.302 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa8907a-76a5-45cf-835f-c47e23890351]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.327 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[f76fa29c-b063-42cd-abdd-c51888eb60ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.330 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[dadf8dc3-d022-4dfe-9bed-d903ff7b25ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.354 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[35672d6d-5cd9-4897-af2a-60ec42aec303]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.369 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c9fbbed7-a68e-4efe-b053-b19c9b82591d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 530, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 530, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606313, 'reachable_time': 16457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 5, 'outoctets': 376, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 5, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 376, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 5, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217258, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.384 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[8e357e07-48ed-42a9-8c3e-b302eb6d61d8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9f3f30c5-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606323, 'tstamp': 606323}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217259, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9f3f30c5-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606325, 'tstamp': 606325}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217259, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.385 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.387 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.388 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3f30c5-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.388 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.388 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3f30c5-b0, col_values=(('external_ids', {'iface-id': '340fa0af-180b-44ed-9c22-e18a8f5ebdec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.389 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:31.389 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:53:31 compute-0 openstack_network_exporter[197909]: ERROR   13:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:53:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:53:31 compute-0 openstack_network_exporter[197909]: ERROR   13:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:53:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.468 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.469 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.469 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:53:31 compute-0 nova_compute[185723]: 2026-02-16 13:53:31.560 185727 INFO nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Post operation of migration started
Feb 16 13:53:32 compute-0 nova_compute[185723]: 2026-02-16 13:53:32.195 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:32 compute-0 nova_compute[185723]: 2026-02-16 13:53:32.196 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquired lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:32 compute-0 nova_compute[185723]: 2026-02-16 13:53:32.196 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:53:32 compute-0 nova_compute[185723]: 2026-02-16 13:53:32.196 185727 DEBUG nova.objects.instance [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 169216bd-b5ad-4408-8962-d36ad92cbf8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:32 compute-0 nova_compute[185723]: 2026-02-16 13:53:32.945 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:33 compute-0 nova_compute[185723]: 2026-02-16 13:53:33.173 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:33 compute-0 nova_compute[185723]: 2026-02-16 13:53:33.174 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquired lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:33 compute-0 nova_compute[185723]: 2026-02-16 13:53:33.174 185727 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.751 185727 DEBUG nova.network.neutron [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updating instance_info_cache with network_info: [{"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.774 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Releasing lock "refresh_cache-169216bd-b5ad-4408-8962-d36ad92cbf8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.774 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.775 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.775 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.775 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:34 compute-0 nova_compute[185723]: 2026-02-16 13:53:34.776 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.250 185727 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.274 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Releasing lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.306 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.307 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.307 185727 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.310 185727 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:53:35 compute-0 virtqemud[184843]: Domain id=23 name='instance-0000001c' uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab is tainted: custom-monitor
Feb 16 13:53:35 compute-0 nova_compute[185723]: 2026-02-16 13:53:35.967 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:35 compute-0 sshd-session[217260]: Invalid user postgres from 188.166.42.159 port 47200
Feb 16 13:53:36 compute-0 sshd-session[217260]: Connection closed by invalid user postgres 188.166.42.159 port 47200 [preauth]
Feb 16 13:53:36 compute-0 nova_compute[185723]: 2026-02-16 13:53:36.316 185727 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:53:36 compute-0 sshd-session[217262]: Invalid user ubuntu from 64.227.72.94 port 47228
Feb 16 13:53:36 compute-0 sshd-session[217262]: Connection closed by invalid user ubuntu 64.227.72.94 port 47228 [preauth]
Feb 16 13:53:37 compute-0 nova_compute[185723]: 2026-02-16 13:53:37.322 185727 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:53:37 compute-0 nova_compute[185723]: 2026-02-16 13:53:37.326 185727 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:37 compute-0 nova_compute[185723]: 2026-02-16 13:53:37.348 185727 DEBUG nova.objects.instance [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:53:37 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:37.391 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:37 compute-0 nova_compute[185723]: 2026-02-16 13:53:37.775 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:37 compute-0 nova_compute[185723]: 2026-02-16 13:53:37.947 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:40 compute-0 nova_compute[185723]: 2026-02-16 13:53:40.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:40 compute-0 nova_compute[185723]: 2026-02-16 13:53:40.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:53:40 compute-0 nova_compute[185723]: 2026-02-16 13:53:40.969 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:40 compute-0 sshd-session[217265]: Connection closed by 103.213.244.180 port 38504 [preauth]
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.814 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.841 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Triggering sync for uuid 169216bd-b5ad-4408-8962-d36ad92cbf8c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.842 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Triggering sync for uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.843 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.844 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.844 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.845 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.878 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.880 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:42 compute-0 nova_compute[185723]: 2026-02-16 13:53:42.950 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.648 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.648 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.649 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.649 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.649 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.650 185727 INFO nova.compute.manager [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Terminating instance
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.651 185727 DEBUG nova.compute.manager [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:53:44 compute-0 kernel: tapda0306b3-85 (unregistering): left promiscuous mode
Feb 16 13:53:44 compute-0 NetworkManager[56177]: <info>  [1771250024.6894] device (tapda0306b3-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:53:44 compute-0 ovn_controller[96072]: 2026-02-16T13:53:44Z|00245|binding|INFO|Releasing lport da0306b3-8514-4ef0-984c-14d90dedd285 from this chassis (sb_readonly=0)
Feb 16 13:53:44 compute-0 ovn_controller[96072]: 2026-02-16T13:53:44Z|00246|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 down in Southbound
Feb 16 13:53:44 compute-0 ovn_controller[96072]: 2026-02-16T13:53:44Z|00247|binding|INFO|Removing iface tapda0306b3-85 ovn-installed in OVS
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.700 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.702 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.704 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.724 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:49:ad 10.100.0.5'], port_security=['fa:16:3e:04:49:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '13', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=da0306b3-8514-4ef0-984c-14d90dedd285) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.725 105360 INFO neutron.agent.ovn.metadata.agent [-] Port da0306b3-8514-4ef0-984c-14d90dedd285 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c unbound from our chassis
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.726 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.739 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ede7e211-1e29-4a64-9c9b-950264b58af6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Feb 16 13:53:44 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Consumed 1.566s CPU time.
Feb 16 13:53:44 compute-0 systemd-machined[155229]: Machine qemu-23-instance-0000001c terminated.
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.765 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d9241333-2c7d-4ada-b1a4-9e4ad588159c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.767 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6598d1-b630-4628-b603-6f7934016ce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.791 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[d542f0d8-2373-42d6-9fba-f488ba199368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.806 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[32c8690a-eca4-45ba-8a5f-4fa0d96f33f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 9, 'rx_bytes': 1372, 'tx_bytes': 614, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 9, 'rx_bytes': 1372, 'tx_bytes': 614, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606313, 'reachable_time': 16457, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 5, 'outoctets': 376, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 5, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 376, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 5, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217279, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.821 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c5cd7fb5-791d-4234-8d78-f05baf20b4cf]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9f3f30c5-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606323, 'tstamp': 606323}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217280, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9f3f30c5-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 606325, 'tstamp': 606325}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217280, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.823 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.852 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.856 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.856 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3f30c5-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.857 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.858 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3f30c5-b0, col_values=(('external_ids', {'iface-id': '340fa0af-180b-44ed-9c22-e18a8f5ebdec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:44 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:44.858 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.871 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.874 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.896 185727 INFO nova.virt.libvirt.driver [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance destroyed successfully.
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.896 185727 DEBUG nova.objects.instance [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'resources' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.975 185727 DEBUG nova.virt.libvirt.vif [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:53:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:53:37Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.976 185727 DEBUG nova.network.os_vif_util [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.976 185727 DEBUG nova.network.os_vif_util [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.977 185727 DEBUG os_vif [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.978 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.978 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda0306b3-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.980 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.981 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.983 185727 INFO os_vif [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85')
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.984 185727 INFO nova.virt.libvirt.driver [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Deleting instance files /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab_del
Feb 16 13:53:44 compute-0 nova_compute[185723]: 2026-02-16 13:53:44.984 185727 INFO nova.virt.libvirt.driver [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Deletion of /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab_del complete
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.079 185727 INFO nova.compute.manager [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Took 0.43 seconds to destroy the instance on the hypervisor.
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.079 185727 DEBUG oslo.service.loopingcall [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.080 185727 DEBUG nova.compute.manager [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.080 185727 DEBUG nova.network.neutron [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.700 185727 DEBUG nova.compute.manager [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.700 185727 DEBUG oslo_concurrency.lockutils [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.701 185727 DEBUG oslo_concurrency.lockutils [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.701 185727 DEBUG oslo_concurrency.lockutils [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.701 185727 DEBUG nova.compute.manager [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:45 compute-0 nova_compute[185723]: 2026-02-16 13:53:45.701 185727 DEBUG nova.compute.manager [req-736734e6-37af-49d1-b3f3-391182e201a5 req-99430d26-ecca-45c8-8537-c0e5825dc049 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.022 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.452 185727 DEBUG nova.network.neutron [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.477 185727 INFO nova.compute.manager [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Took 1.40 seconds to deallocate network for instance.
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.538 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.538 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.543 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.588 185727 DEBUG nova.compute.manager [req-cd502128-0332-4f2c-9598-5be8d1850b17 req-6023712f-3e1c-492f-94b3-23662910662a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-deleted-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.590 185727 INFO nova.scheduler.client.report [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Deleted allocations for instance b81c5faa-2832-4df4-8db7-1ffb8d8209ab
Feb 16 13:53:46 compute-0 nova_compute[185723]: 2026-02-16 13:53:46.680 185727 DEBUG oslo_concurrency.lockutils [None req-6231334a-a7e8-4c78-b987-b188bdce5169 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.565 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.565 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.565 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.566 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.566 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.567 185727 INFO nova.compute.manager [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Terminating instance
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.568 185727 DEBUG nova.compute.manager [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:53:47 compute-0 kernel: tap1f9f033c-64 (unregistering): left promiscuous mode
Feb 16 13:53:47 compute-0 NetworkManager[56177]: <info>  [1771250027.5966] device (tap1f9f033c-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00248|binding|INFO|Releasing lport 1f9f033c-6441-4f4d-a631-8d0870baa901 from this chassis (sb_readonly=0)
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00249|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 down in Southbound
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00250|binding|INFO|Removing iface tap1f9f033c-64 ovn-installed in OVS
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.600 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.602 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.605 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.607 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:9d:0a 10.100.0.8'], port_security=['fa:16:3e:23:9d:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '169216bd-b5ad-4408-8962-d36ad92cbf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=1f9f033c-6441-4f4d-a631-8d0870baa901) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.608 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 1f9f033c-6441-4f4d-a631-8d0870baa901 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c unbound from our chassis
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.609 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.610 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c8cbd0-30aa-4a58-bff3-09aa7f9f6a5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.610 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c namespace which is not needed anymore
Feb 16 13:53:47 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Feb 16 13:53:47 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001b.scope: Consumed 12.760s CPU time.
Feb 16 13:53:47 compute-0 systemd-machined[155229]: Machine qemu-22-instance-0000001b terminated.
Feb 16 13:53:47 compute-0 podman[217304]: 2026-02-16 13:53:47.68200362 +0000 UTC m=+0.055853771 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:53:47 compute-0 podman[217302]: 2026-02-16 13:53:47.702538041 +0000 UTC m=+0.076394032 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, version=9.7, container_name=openstack_network_exporter, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 16 13:53:47 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [NOTICE]   (217040) : haproxy version is 2.8.14-c23fe91
Feb 16 13:53:47 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [NOTICE]   (217040) : path to executable is /usr/sbin/haproxy
Feb 16 13:53:47 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [WARNING]  (217040) : Exiting Master process...
Feb 16 13:53:47 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [ALERT]    (217040) : Current worker (217042) exited with code 143 (Terminated)
Feb 16 13:53:47 compute-0 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[217036]: [WARNING]  (217040) : All workers exited. Exiting... (0)
Feb 16 13:53:47 compute-0 systemd[1]: libpod-aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936.scope: Deactivated successfully.
Feb 16 13:53:47 compute-0 podman[217355]: 2026-02-16 13:53:47.733175133 +0000 UTC m=+0.043459082 container died aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:53:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936-userdata-shm.mount: Deactivated successfully.
Feb 16 13:53:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-a3b1ea43ac2b40e9d87529f422f7d9eae31c604773733c64a580258fd332b817-merged.mount: Deactivated successfully.
Feb 16 13:53:47 compute-0 podman[217355]: 2026-02-16 13:53:47.762658927 +0000 UTC m=+0.072942876 container cleanup aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:53:47 compute-0 systemd[1]: libpod-conmon-aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936.scope: Deactivated successfully.
Feb 16 13:53:47 compute-0 kernel: tap1f9f033c-64: entered promiscuous mode
Feb 16 13:53:47 compute-0 kernel: tap1f9f033c-64 (unregistering): left promiscuous mode
Feb 16 13:53:47 compute-0 NetworkManager[56177]: <info>  [1771250027.7888] manager: (tap1f9f033c-64): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00251|binding|INFO|Claiming lport 1f9f033c-6441-4f4d-a631-8d0870baa901 for this chassis.
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00252|binding|INFO|1f9f033c-6441-4f4d-a631-8d0870baa901: Claiming fa:16:3e:23:9d:0a 10.100.0.8
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.789 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.797 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:9d:0a 10.100.0.8'], port_security=['fa:16:3e:23:9d:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '169216bd-b5ad-4408-8962-d36ad92cbf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=1f9f033c-6441-4f4d-a631-8d0870baa901) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00253|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 ovn-installed in OVS
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00254|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 up in Southbound
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00255|binding|INFO|Releasing lport 1f9f033c-6441-4f4d-a631-8d0870baa901 from this chassis (sb_readonly=1)
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00256|if_status|INFO|Dropped 5 log messages in last 233 seconds (most recently, 233 seconds ago) due to excessive rate
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00257|if_status|INFO|Not setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 down as sb is readonly
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.799 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00258|binding|INFO|Removing iface tap1f9f033c-64 ovn-installed in OVS
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.801 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00259|binding|INFO|Releasing lport 1f9f033c-6441-4f4d-a631-8d0870baa901 from this chassis (sb_readonly=0)
Feb 16 13:53:47 compute-0 ovn_controller[96072]: 2026-02-16T13:53:47Z|00260|binding|INFO|Setting lport 1f9f033c-6441-4f4d-a631-8d0870baa901 down in Southbound
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.804 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.809 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:9d:0a 10.100.0.8'], port_security=['fa:16:3e:23:9d:0a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '169216bd-b5ad-4408-8962-d36ad92cbf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '4', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=1f9f033c-6441-4f4d-a631-8d0870baa901) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.812 185727 DEBUG nova.compute.manager [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.812 185727 DEBUG oslo_concurrency.lockutils [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.812 185727 DEBUG oslo_concurrency.lockutils [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.813 185727 DEBUG oslo_concurrency.lockutils [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.813 185727 DEBUG nova.compute.manager [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.813 185727 WARNING nova.compute.manager [req-91418210-8598-4153-afcb-165e58aa4766 req-17d53a39-453e-464e-83aa-cf3af4709aa5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state deleted and task_state None.
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.830 185727 INFO nova.virt.libvirt.driver [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Instance destroyed successfully.
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.830 185727 DEBUG nova.objects.instance [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'resources' on Instance uuid 169216bd-b5ad-4408-8962-d36ad92cbf8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:47 compute-0 podman[217391]: 2026-02-16 13:53:47.834424523 +0000 UTC m=+0.053021940 container remove aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.839 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fd115fdd-a61d-4426-bb35-316c08855c34]: (4, ('Mon Feb 16 01:53:47 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c (aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936)\naac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936\nMon Feb 16 01:53:47 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c (aac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936)\naac1415ddbfc290c7f36da12fe165002ab21e8ceafd75a0310b884de947e1936\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.841 185727 DEBUG nova.virt.libvirt.vif [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:52:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-5212048',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-5212048',id=27,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:53:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-wp4nxnp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:53:00Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=169216bd-b5ad-4408-8962-d36ad92cbf8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.842 185727 DEBUG nova.network.os_vif_util [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "1f9f033c-6441-4f4d-a631-8d0870baa901", "address": "fa:16:3e:23:9d:0a", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f9f033c-64", "ovs_interfaceid": "1f9f033c-6441-4f4d-a631-8d0870baa901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.840 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d311ca29-7b15-44be-abf9-d21790a4846b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.841 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.842 185727 DEBUG nova.network.os_vif_util [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.843 185727 DEBUG os_vif [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:53:47 compute-0 kernel: tap9f3f30c5-b0: left promiscuous mode
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.844 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.845 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f9f033c-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.846 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.847 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.850 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.851 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.853 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[5f896771-75a5-42ed-bdce-6a189c2313c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.853 185727 INFO os_vif [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:9d:0a,bridge_name='br-int',has_traffic_filtering=True,id=1f9f033c-6441-4f4d-a631-8d0870baa901,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f9f033c-64')
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.854 185727 INFO nova.virt.libvirt.driver [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Deleting instance files /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c_del
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.855 185727 INFO nova.virt.libvirt.driver [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Deletion of /var/lib/nova/instances/169216bd-b5ad-4408-8962-d36ad92cbf8c_del complete
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.869 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[48a83b92-436f-4ac3-bc4a-03a61a24929a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.872 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[677cc09e-bf78-4b29-a665-95655e5f3e0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.887 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[2ece870e-2f08-4d8d-b9f8-19cb171fcbc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 606307, 'reachable_time': 22488, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217426, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d9f3f30c5\x2db9b2\x2d44c9\x2dbea9\x2d678f6d4e1e0c.mount: Deactivated successfully.
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.891 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.891 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[5db8623d-1252-4334-bb78-383e20164d19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.893 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 1f9f033c-6441-4f4d-a631-8d0870baa901 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c unbound from our chassis
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.894 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.895 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[aba050f3-d264-4c25-a42f-75aa4ae7b485]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.895 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 1f9f033c-6441-4f4d-a631-8d0870baa901 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c unbound from our chassis
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.896 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:53:47 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:53:47.897 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[fb53d068-abe5-4d41-afe4-e85b5d7f50fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.911 185727 INFO nova.compute.manager [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.911 185727 DEBUG oslo.service.loopingcall [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.912 185727 DEBUG nova.compute.manager [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:53:47 compute-0 nova_compute[185723]: 2026-02-16 13:53:47.912 185727 DEBUG nova.network.neutron [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.639 185727 DEBUG nova.network.neutron [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.659 185727 INFO nova.compute.manager [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Took 0.75 seconds to deallocate network for instance.
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.710 185727 DEBUG nova.compute.manager [req-7ef85bf6-9ac9-427c-8724-8a8c079d857e req-4f00a357-90e4-4650-bf98-e42d3c0475ca faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-vif-deleted-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.713 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.713 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.757 185727 DEBUG nova.compute.provider_tree [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.771 185727 DEBUG nova.scheduler.client.report [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.796 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.820 185727 INFO nova.scheduler.client.report [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Deleted allocations for instance 169216bd-b5ad-4408-8962-d36ad92cbf8c
Feb 16 13:53:48 compute-0 nova_compute[185723]: 2026-02-16 13:53:48.903 185727 DEBUG oslo_concurrency.lockutils [None req-a8bc9575-b9f3-4e02-832d-2b4ad31e1cb8 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.966 185727 DEBUG nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-vif-unplugged-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.967 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.967 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.967 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.967 185727 DEBUG nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] No waiting events found dispatching network-vif-unplugged-1f9f033c-6441-4f4d-a631-8d0870baa901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.968 185727 WARNING nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received unexpected event network-vif-unplugged-1f9f033c-6441-4f4d-a631-8d0870baa901 for instance with vm_state deleted and task_state None.
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.968 185727 DEBUG nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.968 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.968 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.968 185727 DEBUG oslo_concurrency.lockutils [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "169216bd-b5ad-4408-8962-d36ad92cbf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.969 185727 DEBUG nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] No waiting events found dispatching network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:49 compute-0 nova_compute[185723]: 2026-02-16 13:53:49.969 185727 WARNING nova.compute.manager [req-1ba9419f-aaf6-43e0-8fef-d55a99c807ca req-6a7d28e5-b642-48cc-bb98-526fab92c78d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Received unexpected event network-vif-plugged-1f9f033c-6441-4f4d-a631-8d0870baa901 for instance with vm_state deleted and task_state None.
Feb 16 13:53:51 compute-0 sshd-session[217427]: Invalid user postgres from 146.190.22.227 port 56332
Feb 16 13:53:51 compute-0 nova_compute[185723]: 2026-02-16 13:53:51.063 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:51 compute-0 sshd-session[217427]: Connection closed by invalid user postgres 146.190.22.227 port 56332 [preauth]
Feb 16 13:53:52 compute-0 nova_compute[185723]: 2026-02-16 13:53:52.846 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:54 compute-0 podman[217429]: 2026-02-16 13:53:54.035490802 +0000 UTC m=+0.070986708 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:53:56 compute-0 nova_compute[185723]: 2026-02-16 13:53:56.068 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:57 compute-0 nova_compute[185723]: 2026-02-16 13:53:57.847 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:59 compute-0 podman[217455]: 2026-02-16 13:53:59.001275257 +0000 UTC m=+0.044798456 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:53:59 compute-0 podman[195053]: time="2026-02-16T13:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:53:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:53:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 13:53:59 compute-0 nova_compute[185723]: 2026-02-16 13:53:59.896 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771250024.8948128, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:59 compute-0 nova_compute[185723]: 2026-02-16 13:53:59.897 185727 INFO nova.compute.manager [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Stopped (Lifecycle Event)
Feb 16 13:53:59 compute-0 nova_compute[185723]: 2026-02-16 13:53:59.918 185727 DEBUG nova.compute.manager [None req-c2c9ea1d-3c3f-48ee-a583-dab96334efe1 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:54:00 compute-0 sshd-session[217479]: Invalid user test from 146.190.226.24 port 59210
Feb 16 13:54:01 compute-0 nova_compute[185723]: 2026-02-16 13:54:01.068 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:01 compute-0 sshd-session[217479]: Connection closed by invalid user test 146.190.226.24 port 59210 [preauth]
Feb 16 13:54:01 compute-0 openstack_network_exporter[197909]: ERROR   13:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:54:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:54:01 compute-0 openstack_network_exporter[197909]: ERROR   13:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:54:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:54:02 compute-0 nova_compute[185723]: 2026-02-16 13:54:02.829 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771250027.8277473, 169216bd-b5ad-4408-8962-d36ad92cbf8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:54:02 compute-0 nova_compute[185723]: 2026-02-16 13:54:02.829 185727 INFO nova.compute.manager [-] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] VM Stopped (Lifecycle Event)
Feb 16 13:54:02 compute-0 nova_compute[185723]: 2026-02-16 13:54:02.850 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:02 compute-0 nova_compute[185723]: 2026-02-16 13:54:02.854 185727 DEBUG nova.compute.manager [None req-59c9207b-ad2a-472e-9be0-1f42e85a6353 - - - - - -] [instance: 169216bd-b5ad-4408-8962-d36ad92cbf8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:54:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:03.249 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:03.249 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:03.250 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:03 compute-0 nova_compute[185723]: 2026-02-16 13:54:03.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:06 compute-0 nova_compute[185723]: 2026-02-16 13:54:06.071 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:07 compute-0 nova_compute[185723]: 2026-02-16 13:54:07.852 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:11 compute-0 nova_compute[185723]: 2026-02-16 13:54:11.074 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:12 compute-0 nova_compute[185723]: 2026-02-16 13:54:12.892 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:16 compute-0 nova_compute[185723]: 2026-02-16 13:54:16.075 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:17 compute-0 nova_compute[185723]: 2026-02-16 13:54:17.894 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:18 compute-0 podman[217481]: 2026-02-16 13:54:18.003251202 +0000 UTC m=+0.045421211 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Feb 16 13:54:18 compute-0 podman[217482]: 2026-02-16 13:54:18.003206561 +0000 UTC m=+0.042963050 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:54:18 compute-0 ovn_controller[96072]: 2026-02-16T13:54:18Z|00261|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 16 13:54:21 compute-0 nova_compute[185723]: 2026-02-16 13:54:21.077 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:22 compute-0 sshd-session[217522]: Invalid user ubuntu from 64.227.72.94 port 36030
Feb 16 13:54:22 compute-0 sshd-session[217522]: Connection closed by invalid user ubuntu 64.227.72.94 port 36030 [preauth]
Feb 16 13:54:22 compute-0 nova_compute[185723]: 2026-02-16 13:54:22.938 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:25 compute-0 podman[217524]: 2026-02-16 13:54:25.02115974 +0000 UTC m=+0.062530937 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:54:26 compute-0 nova_compute[185723]: 2026-02-16 13:54:26.078 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:27 compute-0 nova_compute[185723]: 2026-02-16 13:54:27.941 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:28 compute-0 sshd-session[217551]: Invalid user postgres from 188.166.42.159 port 50928
Feb 16 13:54:28 compute-0 sshd-session[217551]: Connection closed by invalid user postgres 188.166.42.159 port 50928 [preauth]
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.451 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.452 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.537 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.538 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.538 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.538 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:54:29 compute-0 podman[217554]: 2026-02-16 13:54:29.6432045 +0000 UTC m=+0.059473442 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.673 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.674 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.22000122070312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.674 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:29 compute-0 nova_compute[185723]: 2026-02-16 13:54:29.675 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:29 compute-0 podman[195053]: time="2026-02-16T13:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:54:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:54:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.145 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.145 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.289 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.356 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.394 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:54:30 compute-0 nova_compute[185723]: 2026-02-16 13:54:30.395 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:31 compute-0 nova_compute[185723]: 2026-02-16 13:54:31.081 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:31 compute-0 nova_compute[185723]: 2026-02-16 13:54:31.377 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:31 compute-0 nova_compute[185723]: 2026-02-16 13:54:31.377 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:31 compute-0 openstack_network_exporter[197909]: ERROR   13:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:54:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:54:31 compute-0 openstack_network_exporter[197909]: ERROR   13:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:54:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:54:31 compute-0 nova_compute[185723]: 2026-02-16 13:54:31.440 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:31.440 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:54:31 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:31.441 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:54:32 compute-0 nova_compute[185723]: 2026-02-16 13:54:32.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:32 compute-0 nova_compute[185723]: 2026-02-16 13:54:32.551 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:32 compute-0 nova_compute[185723]: 2026-02-16 13:54:32.994 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:33 compute-0 nova_compute[185723]: 2026-02-16 13:54:33.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:33 compute-0 nova_compute[185723]: 2026-02-16 13:54:33.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:54:33 compute-0 nova_compute[185723]: 2026-02-16 13:54:33.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:54:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:54:33.443 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:54:33 compute-0 nova_compute[185723]: 2026-02-16 13:54:33.470 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:54:34 compute-0 nova_compute[185723]: 2026-02-16 13:54:34.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:34 compute-0 nova_compute[185723]: 2026-02-16 13:54:34.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:34 compute-0 nova_compute[185723]: 2026-02-16 13:54:34.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:54:35 compute-0 nova_compute[185723]: 2026-02-16 13:54:35.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:36 compute-0 nova_compute[185723]: 2026-02-16 13:54:36.082 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:37 compute-0 nova_compute[185723]: 2026-02-16 13:54:37.996 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:41 compute-0 nova_compute[185723]: 2026-02-16 13:54:41.125 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:42 compute-0 nova_compute[185723]: 2026-02-16 13:54:42.998 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:46 compute-0 nova_compute[185723]: 2026-02-16 13:54:46.128 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:48 compute-0 nova_compute[185723]: 2026-02-16 13:54:48.000 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:49 compute-0 podman[217580]: 2026-02-16 13:54:49.006190942 +0000 UTC m=+0.049090173 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, 
vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 13:54:49 compute-0 podman[217581]: 2026-02-16 13:54:49.012466838 +0000 UTC m=+0.050835616 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:54:51 compute-0 nova_compute[185723]: 2026-02-16 13:54:51.130 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:53 compute-0 nova_compute[185723]: 2026-02-16 13:54:53.040 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:56 compute-0 podman[217620]: 2026-02-16 13:54:56.029070442 +0000 UTC m=+0.068927917 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:54:56 compute-0 nova_compute[185723]: 2026-02-16 13:54:56.132 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:58 compute-0 nova_compute[185723]: 2026-02-16 13:54:58.043 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:59 compute-0 podman[195053]: time="2026-02-16T13:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:54:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:54:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 13:55:00 compute-0 podman[217649]: 2026-02-16 13:55:00.006142771 +0000 UTC m=+0.049063413 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:55:01 compute-0 nova_compute[185723]: 2026-02-16 13:55:01.133 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:01 compute-0 openstack_network_exporter[197909]: ERROR   13:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:55:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:55:01 compute-0 openstack_network_exporter[197909]: ERROR   13:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:55:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:55:03 compute-0 nova_compute[185723]: 2026-02-16 13:55:03.093 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:03.250 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:03.250 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:03.250 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:06 compute-0 nova_compute[185723]: 2026-02-16 13:55:06.135 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:06 compute-0 sshd-session[217673]: Invalid user test from 146.190.226.24 port 40628
Feb 16 13:55:06 compute-0 sshd-session[217673]: Connection closed by invalid user test 146.190.226.24 port 40628 [preauth]
Feb 16 13:55:07 compute-0 sshd-session[217675]: Invalid user ubuntu from 64.227.72.94 port 59612
Feb 16 13:55:07 compute-0 sshd-session[217675]: Connection closed by invalid user ubuntu 64.227.72.94 port 59612 [preauth]
Feb 16 13:55:08 compute-0 nova_compute[185723]: 2026-02-16 13:55:08.097 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:11 compute-0 nova_compute[185723]: 2026-02-16 13:55:11.137 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:13 compute-0 nova_compute[185723]: 2026-02-16 13:55:13.100 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:16 compute-0 nova_compute[185723]: 2026-02-16 13:55:16.138 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:18 compute-0 nova_compute[185723]: 2026-02-16 13:55:18.159 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:19 compute-0 ovn_controller[96072]: 2026-02-16T13:55:19Z|00262|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:55:20 compute-0 podman[217678]: 2026-02-16 13:55:20.008111795 +0000 UTC m=+0.044944400 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:55:20 compute-0 podman[217677]: 2026-02-16 13:55:20.008157736 +0000 UTC m=+0.049372770 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:55:21 compute-0 nova_compute[185723]: 2026-02-16 13:55:21.140 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:21 compute-0 sshd-session[217717]: Invalid user postgres from 188.166.42.159 port 56452
Feb 16 13:55:21 compute-0 sshd-session[217717]: Connection closed by invalid user postgres 188.166.42.159 port 56452 [preauth]
Feb 16 13:55:23 compute-0 nova_compute[185723]: 2026-02-16 13:55:23.162 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:26 compute-0 nova_compute[185723]: 2026-02-16 13:55:26.143 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:26 compute-0 sshd-session[217719]: Invalid user postgres from 146.190.22.227 port 51530
Feb 16 13:55:26 compute-0 podman[217721]: 2026-02-16 13:55:26.391033788 +0000 UTC m=+0.075784347 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_managed=true, container_name=ovn_controller)
Feb 16 13:55:26 compute-0 sshd-session[217719]: Connection closed by invalid user postgres 146.190.22.227 port 51530 [preauth]
Feb 16 13:55:28 compute-0 nova_compute[185723]: 2026-02-16 13:55:28.165 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.477 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.478 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.478 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.478 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.623 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.624 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5842MB free_disk=73.22007751464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.624 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.625 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:29 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.732 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.733 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:55:29 compute-0 podman[195053]: time="2026-02-16T13:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:55:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:55:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.788 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.810 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.812 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:55:29 compute-0 nova_compute[185723]: 2026-02-16 13:55:29.812 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:30 compute-0 nova_compute[185723]: 2026-02-16 13:55:30.813 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:31 compute-0 podman[217749]: 2026-02-16 13:55:31.010616415 +0000 UTC m=+0.049479882 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:55:31 compute-0 nova_compute[185723]: 2026-02-16 13:55:31.144 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:31 compute-0 openstack_network_exporter[197909]: ERROR   13:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:55:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:55:31 compute-0 openstack_network_exporter[197909]: ERROR   13:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:55:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:55:31 compute-0 nova_compute[185723]: 2026-02-16 13:55:31.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:31 compute-0 nova_compute[185723]: 2026-02-16 13:55:31.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:33 compute-0 nova_compute[185723]: 2026-02-16 13:55:33.169 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:33.307 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:55:33 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:33.308 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:55:33 compute-0 nova_compute[185723]: 2026-02-16 13:55:33.308 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:33 compute-0 nova_compute[185723]: 2026-02-16 13:55:33.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.454 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.454 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.455 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.480 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.480 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:34 compute-0 nova_compute[185723]: 2026-02-16 13:55:34.481 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:55:36 compute-0 nova_compute[185723]: 2026-02-16 13:55:36.146 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:36 compute-0 nova_compute[185723]: 2026-02-16 13:55:36.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:37 compute-0 nova_compute[185723]: 2026-02-16 13:55:37.428 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:38 compute-0 nova_compute[185723]: 2026-02-16 13:55:38.215 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:41 compute-0 nova_compute[185723]: 2026-02-16 13:55:41.148 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:41 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:55:41.309 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:43 compute-0 nova_compute[185723]: 2026-02-16 13:55:43.254 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:46 compute-0 nova_compute[185723]: 2026-02-16 13:55:46.150 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:48 compute-0 nova_compute[185723]: 2026-02-16 13:55:48.256 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-0 podman[217775]: 2026-02-16 13:55:51.012135089 +0000 UTC m=+0.052030536 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 13:55:51 compute-0 podman[217774]: 2026-02-16 13:55:51.012658362 +0000 UTC m=+0.054355074 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:55:51 compute-0 nova_compute[185723]: 2026-02-16 13:55:51.183 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-0 sshd-session[217815]: Invalid user ubuntu from 64.227.72.94 port 42542
Feb 16 13:55:51 compute-0 sshd-session[217815]: Connection closed by invalid user ubuntu 64.227.72.94 port 42542 [preauth]
Feb 16 13:55:53 compute-0 nova_compute[185723]: 2026-02-16 13:55:53.259 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:56 compute-0 nova_compute[185723]: 2026-02-16 13:55:56.184 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:57 compute-0 podman[217817]: 2026-02-16 13:55:57.033487094 +0000 UTC m=+0.070803143 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:55:58 compute-0 nova_compute[185723]: 2026-02-16 13:55:58.261 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:59 compute-0 podman[195053]: time="2026-02-16T13:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:55:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:55:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 13:56:01 compute-0 nova_compute[185723]: 2026-02-16 13:56:01.186 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:01 compute-0 openstack_network_exporter[197909]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:56:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:56:01 compute-0 openstack_network_exporter[197909]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:56:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:56:02 compute-0 podman[217843]: 2026-02-16 13:56:02.009225388 +0000 UTC m=+0.049846081 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:56:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:03.251 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:03.251 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:03.251 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:03 compute-0 nova_compute[185723]: 2026-02-16 13:56:03.265 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:06 compute-0 nova_compute[185723]: 2026-02-16 13:56:06.189 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:08 compute-0 nova_compute[185723]: 2026-02-16 13:56:08.267 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:11 compute-0 nova_compute[185723]: 2026-02-16 13:56:11.190 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:13 compute-0 sshd-session[217865]: Invalid user test from 146.190.226.24 port 58758
Feb 16 13:56:13 compute-0 sshd-session[217867]: Invalid user postgres from 188.166.42.159 port 59782
Feb 16 13:56:13 compute-0 nova_compute[185723]: 2026-02-16 13:56:13.269 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:13 compute-0 sshd-session[217865]: Connection closed by invalid user test 146.190.226.24 port 58758 [preauth]
Feb 16 13:56:13 compute-0 sshd-session[217867]: Connection closed by invalid user postgres 188.166.42.159 port 59782 [preauth]
Feb 16 13:56:16 compute-0 nova_compute[185723]: 2026-02-16 13:56:16.191 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:18 compute-0 nova_compute[185723]: 2026-02-16 13:56:18.271 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:21 compute-0 nova_compute[185723]: 2026-02-16 13:56:21.194 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:22 compute-0 podman[217869]: 2026-02-16 13:56:22.01073727 +0000 UTC m=+0.046994371 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1770267347, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.7)
Feb 16 13:56:22 compute-0 podman[217870]: 2026-02-16 13:56:22.014549645 +0000 UTC m=+0.046757575 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 16 13:56:23 compute-0 nova_compute[185723]: 2026-02-16 13:56:23.275 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:26 compute-0 nova_compute[185723]: 2026-02-16 13:56:26.034 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating tmpfile /var/lib/nova/instances/tmphtyz6pdh to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:56:26 compute-0 nova_compute[185723]: 2026-02-16 13:56:26.036 185727 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:56:26 compute-0 nova_compute[185723]: 2026-02-16 13:56:26.194 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:26 compute-0 nova_compute[185723]: 2026-02-16 13:56:26.980 185727 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:56:27 compute-0 nova_compute[185723]: 2026-02-16 13:56:27.014 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:56:27 compute-0 nova_compute[185723]: 2026-02-16 13:56:27.014 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:56:27 compute-0 nova_compute[185723]: 2026-02-16 13:56:27.014 185727 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:56:28 compute-0 podman[217907]: 2026-02-16 13:56:28.066535733 +0000 UTC m=+0.109787184 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:56:28 compute-0 nova_compute[185723]: 2026-02-16 13:56:28.276 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:28 compute-0 nova_compute[185723]: 2026-02-16 13:56:28.989 185727 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.016 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.018 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.018 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating instance directory: /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.019 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating disk.info with the contents: {'/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk': 'qcow2', '/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.019 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.019 185727 DEBUG nova.objects.instance [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.056 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.108 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.109 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.110 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.123 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.174 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.176 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.205 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.207 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.208 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.263 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.264 185727 DEBUG nova.virt.disk.api [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.265 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.321 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.322 185727 DEBUG nova.virt.disk.api [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.322 185727 DEBUG nova.objects.instance [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.351 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.374 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.376 185727 DEBUG nova.virt.libvirt.volume.remotefs [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config to /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.377 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.457 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.458 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.458 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.563 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.564 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5849MB free_disk=73.21939468383789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.564 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.564 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.607 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Migration for instance 3266d7e2-8d63-44ff-970a-45b95f88dc2f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.626 185727 INFO nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating resource usage from migration f030fcd6-f4e2-43f5-8bfe-798430c65047
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.626 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Starting to track incoming migration f030fcd6-f4e2-43f5-8bfe-798430c65047 with flavor 6d89f72c-1760-421e-a5f2-83dfc3723b84 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.709 185727 WARNING nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Instance 3266d7e2-8d63-44ff-970a-45b95f88dc2f has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.709 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.709 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.732 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing inventories for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:56:29 compute-0 podman[195053]: time="2026-02-16T13:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:56:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:56:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.767 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating ProviderTree inventory for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.767 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Updating inventory in ProviderTree for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.789 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing aggregate associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.815 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Refreshing trait associations for resource provider c9501a85-df32-4b8f-bce0-9425ef1e7866, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.825 185727 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.825 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.826 185727 DEBUG nova.virt.libvirt.vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:55:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2141702843-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:55:52Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.826 185727 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.827 185727 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.828 185727 DEBUG os_vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.829 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.829 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.830 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.832 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.832 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d907fd7-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.833 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8d907fd7-6b, col_values=(('external_ids', {'iface-id': '8d907fd7-6b02-461e-8612-e5f777af8eea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:53:06', 'vm-uuid': '3266d7e2-8d63-44ff-970a-45b95f88dc2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.834 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:29 compute-0 NetworkManager[56177]: <info>  [1771250189.8350] manager: (tap8d907fd7-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.837 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.839 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.840 185727 INFO os_vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b')
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.841 185727 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.841 185727 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.864 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.889 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.912 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:56:29 compute-0 nova_compute[185723]: 2026-02-16 13:56:29.912 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:31 compute-0 nova_compute[185723]: 2026-02-16 13:56:31.197 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:31 compute-0 openstack_network_exporter[197909]: ERROR   13:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:56:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:56:31 compute-0 openstack_network_exporter[197909]: ERROR   13:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:56:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:56:31 compute-0 nova_compute[185723]: 2026-02-16 13:56:31.912 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.443 185727 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Port 8d907fd7-6b02-461e-8612-e5f777af8eea updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.445 185727 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:56:32 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:56:32 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:56:32 compute-0 podman[217955]: 2026-02-16 13:56:32.661035436 +0000 UTC m=+0.042492239 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:56:32 compute-0 kernel: tap8d907fd7-6b: entered promiscuous mode
Feb 16 13:56:32 compute-0 ovn_controller[96072]: 2026-02-16T13:56:32Z|00263|binding|INFO|Claiming lport 8d907fd7-6b02-461e-8612-e5f777af8eea for this additional chassis.
Feb 16 13:56:32 compute-0 ovn_controller[96072]: 2026-02-16T13:56:32Z|00264|binding|INFO|8d907fd7-6b02-461e-8612-e5f777af8eea: Claiming fa:16:3e:d1:53:06 10.100.0.8
Feb 16 13:56:32 compute-0 NetworkManager[56177]: <info>  [1771250192.7798] manager: (tap8d907fd7-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.780 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.782 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.787 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:32 compute-0 systemd-udevd[218011]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.805 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:32 compute-0 systemd-machined[155229]: New machine qemu-24-instance-0000001d.
Feb 16 13:56:32 compute-0 ovn_controller[96072]: 2026-02-16T13:56:32Z|00265|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea ovn-installed in OVS
Feb 16 13:56:32 compute-0 nova_compute[185723]: 2026-02-16 13:56:32.808 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:32 compute-0 NetworkManager[56177]: <info>  [1771250192.8123] device (tap8d907fd7-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:56:32 compute-0 NetworkManager[56177]: <info>  [1771250192.8130] device (tap8d907fd7-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:56:32 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001d.
Feb 16 13:56:33 compute-0 nova_compute[185723]: 2026-02-16 13:56:33.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.478 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771250194.4778962, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.478 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Started (Lifecycle Event)
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.511 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:56:34 compute-0 nova_compute[185723]: 2026-02-16 13:56:34.835 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.199 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.453 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.534 185727 DEBUG nova.virt.driver [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] Emitting event <LifecycleEvent: 1771250196.5346646, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.536 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Resumed (Lifecycle Event)
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.561 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.565 185727 DEBUG nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:56:36 compute-0 nova_compute[185723]: 2026-02-16 13:56:36.589 185727 INFO nova.compute.manager [None req-e9eeed24-180f-4f48-9198-9303009d986c - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 13:56:37 compute-0 nova_compute[185723]: 2026-02-16 13:56:37.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:37 compute-0 nova_compute[185723]: 2026-02-16 13:56:37.435 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:37 compute-0 sshd-session[218042]: Invalid user ubuntu from 64.227.72.94 port 59892
Feb 16 13:56:38 compute-0 sshd-session[218042]: Connection closed by invalid user ubuntu 64.227.72.94 port 59892 [preauth]
Feb 16 13:56:38 compute-0 ovn_controller[96072]: 2026-02-16T13:56:38Z|00266|binding|INFO|Claiming lport 8d907fd7-6b02-461e-8612-e5f777af8eea for this chassis.
Feb 16 13:56:38 compute-0 ovn_controller[96072]: 2026-02-16T13:56:38Z|00267|binding|INFO|8d907fd7-6b02-461e-8612-e5f777af8eea: Claiming fa:16:3e:d1:53:06 10.100.0.8
Feb 16 13:56:38 compute-0 ovn_controller[96072]: 2026-02-16T13:56:38Z|00268|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea up in Southbound
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.702 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:53:06 10.100.0.8'], port_security=['fa:16:3e:d1:53:06 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3266d7e2-8d63-44ff-970a-45b95f88dc2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e98704-cf1f-47d1-8021-93211c7aa37e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77712f67f33f426cb3d6d9b7a640f32a', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'c1368845-2f7a-494d-9bee-474d9166c8a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4e8c351-7159-44d3-b122-efa9b0154fd9, chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=8d907fd7-6b02-461e-8612-e5f777af8eea) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.704 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 8d907fd7-6b02-461e-8612-e5f777af8eea in datapath 09e98704-cf1f-47d1-8021-93211c7aa37e bound to our chassis
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.705 105360 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.713 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[cf261130-8d98-4ae2-aa17-4777dbec93bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.714 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09e98704-c1 in ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.716 206438 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09e98704-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.716 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[36a58dba-9b77-440a-bd28-c4c823d176f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.717 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[bf6fecdd-51ad-47cb-a312-883adff8f6fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.726 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[19741161-486a-4614-89b7-60e367ee3eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.734 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[685ad4de-02fd-40e7-9a2c-b1d436c3a05a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.758 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e4da74-c223-4f07-8db4-1f148c80fbde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 NetworkManager[56177]: <info>  [1771250198.7636] manager: (tap09e98704-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.763 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[50bc2038-3e6c-407d-8fb5-ae6a53a8121d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 systemd-udevd[218051]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.784 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb2b93f-02f2-42a9-bc5b-2b9bff3d5931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.786 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[25ae034c-ae4d-415b-9d63-caf3762e5387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 NetworkManager[56177]: <info>  [1771250198.8101] device (tap09e98704-c0): carrier: link connected
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.814 206452 DEBUG oslo.privsep.daemon [-] privsep: reply[2388fde2-9154-463b-bfbe-48dbfe261bd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.829 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6975f2b1-0c84-4d59-bf60-7d703aebbd97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e98704-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:74:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628220, 'reachable_time': 40295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218070, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.838 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[68a83f1c-7848-4946-be21-2552145df38c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:7412'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628220, 'tstamp': 628220}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218071, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.852 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[761100d1-8bc1-49df-821b-1598493fb184]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e98704-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:74:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628220, 'reachable_time': 40295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218072, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.875 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[c59c0472-3345-4df1-b8c3-926948ee0b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.921 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[ec84b830-7053-4db0-b71b-0321ea030f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.923 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e98704-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.923 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.924 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09e98704-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:38 compute-0 NetworkManager[56177]: <info>  [1771250198.9266] manager: (tap09e98704-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Feb 16 13:56:38 compute-0 nova_compute[185723]: 2026-02-16 13:56:38.926 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:38 compute-0 kernel: tap09e98704-c0: entered promiscuous mode
Feb 16 13:56:38 compute-0 nova_compute[185723]: 2026-02-16 13:56:38.928 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.929 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09e98704-c0, col_values=(('external_ids', {'iface-id': 'eea5c447-c012-4beb-b864-a8e81dbeffa6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:38 compute-0 ovn_controller[96072]: 2026-02-16T13:56:38Z|00269|binding|INFO|Releasing lport eea5c447-c012-4beb-b864-a8e81dbeffa6 from this chassis (sb_readonly=0)
Feb 16 13:56:38 compute-0 nova_compute[185723]: 2026-02-16 13:56:38.930 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:38 compute-0 nova_compute[185723]: 2026-02-16 13:56:38.931 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.931 105360 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.932 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[79425349-6580-4d7e-ad16-6972f190fdf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.933 105360 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: global
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     log         /dev/log local0 debug
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     log-tag     haproxy-metadata-proxy-09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     user        root
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     group       root
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     maxconn     1024
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     pidfile     /var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     daemon
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: defaults
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     log global
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     mode http
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     option httplog
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     option dontlognull
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     option http-server-close
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     option forwardfor
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     retries                 3
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     timeout http-request    30s
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     timeout connect         30s
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     timeout client          32s
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     timeout server          32s
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     timeout http-keep-alive 30s
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: listen listener
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     bind 169.254.169.254:80
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:     http-request add-header X-OVN-Network-ID 09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:56:38 compute-0 nova_compute[185723]: 2026-02-16 13:56:38.934 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:38 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:38.934 105360 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'env', 'PROCESS_TAG=haproxy-09e98704-cf1f-47d1-8021-93211c7aa37e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09e98704-cf1f-47d1-8021-93211c7aa37e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:56:39 compute-0 podman[218105]: 2026-02-16 13:56:39.235459627 +0000 UTC m=+0.043222827 container create f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:56:39 compute-0 systemd[1]: Started libpod-conmon-f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956.scope.
Feb 16 13:56:39 compute-0 nova_compute[185723]: 2026-02-16 13:56:39.283 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:39.284 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:56:39 compute-0 systemd[1]: Started libcrun container.
Feb 16 13:56:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d7c9151a9b0df83218dc9ffa8562e067b3e1fce218ec3b9669deb8a9c9f7329/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:56:39 compute-0 podman[218105]: 2026-02-16 13:56:39.21351179 +0000 UTC m=+0.021275000 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:56:39 compute-0 podman[218105]: 2026-02-16 13:56:39.316594316 +0000 UTC m=+0.124357536 container init f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:56:39 compute-0 podman[218105]: 2026-02-16 13:56:39.320929304 +0000 UTC m=+0.128692504 container start f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:56:39 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [NOTICE]   (218124) : New worker (218126) forked
Feb 16 13:56:39 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [NOTICE]   (218124) : Loading success.
Feb 16 13:56:39 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:39.366 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:56:39 compute-0 nova_compute[185723]: 2026-02-16 13:56:39.464 185727 INFO nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Post operation of migration started
Feb 16 13:56:39 compute-0 nova_compute[185723]: 2026-02-16 13:56:39.836 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:40 compute-0 nova_compute[185723]: 2026-02-16 13:56:40.705 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:56:40 compute-0 nova_compute[185723]: 2026-02-16 13:56:40.706 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:56:40 compute-0 nova_compute[185723]: 2026-02-16 13:56:40.706 185727 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:56:41 compute-0 nova_compute[185723]: 2026-02-16 13:56:41.200 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:43 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:43.369 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:43 compute-0 nova_compute[185723]: 2026-02-16 13:56:43.533 185727 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.447 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.470 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.471 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.471 185727 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.477 185727 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:56:44 compute-0 virtqemud[184843]: Domain id=24 name='instance-0000001d' uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f is tainted: custom-monitor
Feb 16 13:56:44 compute-0 nova_compute[185723]: 2026-02-16 13:56:44.839 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:45 compute-0 nova_compute[185723]: 2026-02-16 13:56:45.488 185727 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:56:46 compute-0 nova_compute[185723]: 2026-02-16 13:56:46.202 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:46 compute-0 nova_compute[185723]: 2026-02-16 13:56:46.492 185727 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:56:46 compute-0 nova_compute[185723]: 2026-02-16 13:56:46.496 185727 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:56:46 compute-0 nova_compute[185723]: 2026-02-16 13:56:46.526 185727 DEBUG nova.objects.instance [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:56:49 compute-0 nova_compute[185723]: 2026-02-16 13:56:49.840 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.242 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.383 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.384 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.384 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.384 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.384 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.386 185727 INFO nova.compute.manager [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Terminating instance
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.386 185727 DEBUG nova.compute.manager [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:56:51 compute-0 kernel: tap8d907fd7-6b (unregistering): left promiscuous mode
Feb 16 13:56:51 compute-0 NetworkManager[56177]: <info>  [1771250211.4132] device (tap8d907fd7-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:56:51 compute-0 ovn_controller[96072]: 2026-02-16T13:56:51Z|00270|binding|INFO|Releasing lport 8d907fd7-6b02-461e-8612-e5f777af8eea from this chassis (sb_readonly=0)
Feb 16 13:56:51 compute-0 ovn_controller[96072]: 2026-02-16T13:56:51Z|00271|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea down in Southbound
Feb 16 13:56:51 compute-0 ovn_controller[96072]: 2026-02-16T13:56:51Z|00272|binding|INFO|Removing iface tap8d907fd7-6b ovn-installed in OVS
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.419 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.426 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.429 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:53:06 10.100.0.8'], port_security=['fa:16:3e:d1:53:06 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3266d7e2-8d63-44ff-970a-45b95f88dc2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e98704-cf1f-47d1-8021-93211c7aa37e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77712f67f33f426cb3d6d9b7a640f32a', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'c1368845-2f7a-494d-9bee-474d9166c8a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4e8c351-7159-44d3-b122-efa9b0154fd9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>], logical_port=8d907fd7-6b02-461e-8612-e5f777af8eea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2e53046760>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.430 105360 INFO neutron.agent.ovn.metadata.agent [-] Port 8d907fd7-6b02-461e-8612-e5f777af8eea in datapath 09e98704-cf1f-47d1-8021-93211c7aa37e unbound from our chassis
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.431 105360 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09e98704-cf1f-47d1-8021-93211c7aa37e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.432 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[08a959a1-e09d-4ad4-9d66-ee414b1f4f63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.432 105360 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e namespace which is not needed anymore
Feb 16 13:56:51 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Feb 16 13:56:51 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Consumed 2.636s CPU time.
Feb 16 13:56:51 compute-0 systemd-machined[155229]: Machine qemu-24-instance-0000001d terminated.
Feb 16 13:56:51 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [NOTICE]   (218124) : haproxy version is 2.8.14-c23fe91
Feb 16 13:56:51 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [NOTICE]   (218124) : path to executable is /usr/sbin/haproxy
Feb 16 13:56:51 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [WARNING]  (218124) : Exiting Master process...
Feb 16 13:56:51 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [ALERT]    (218124) : Current worker (218126) exited with code 143 (Terminated)
Feb 16 13:56:51 compute-0 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[218120]: [WARNING]  (218124) : All workers exited. Exiting... (0)
Feb 16 13:56:51 compute-0 systemd[1]: libpod-f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956.scope: Deactivated successfully.
Feb 16 13:56:51 compute-0 podman[218161]: 2026-02-16 13:56:51.548645263 +0000 UTC m=+0.042605031 container died f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:56:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956-userdata-shm.mount: Deactivated successfully.
Feb 16 13:56:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-3d7c9151a9b0df83218dc9ffa8562e067b3e1fce218ec3b9669deb8a9c9f7329-merged.mount: Deactivated successfully.
Feb 16 13:56:51 compute-0 podman[218161]: 2026-02-16 13:56:51.580619179 +0000 UTC m=+0.074578947 container cleanup f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:56:51 compute-0 systemd[1]: libpod-conmon-f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956.scope: Deactivated successfully.
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.606 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.610 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 podman[218192]: 2026-02-16 13:56:51.643857833 +0000 UTC m=+0.044982331 container remove f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.650 185727 INFO nova.virt.libvirt.driver [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance destroyed successfully.
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.650 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[483c0d9d-1649-4a79-b10f-3dca479351e9]: (4, ('Mon Feb 16 01:56:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e (f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956)\nf40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956\nMon Feb 16 01:56:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e (f40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956)\nf40264dc2678d728f359c19c58e76680897f2442cb9204d7f65bf883aac76956\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.652 185727 DEBUG nova.objects.instance [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lazy-loading 'resources' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.653 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9a65b1-53b0-4d27-8f1c-1c67b07cd86b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.654 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e98704-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:51 compute-0 kernel: tap09e98704-c0: left promiscuous mode
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.655 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.663 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.665 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[3294fba2-20d4-4e20-b02c-be3187f891fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.666 185727 DEBUG nova.virt.libvirt.vif [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:55:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2141702843-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:56:46Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.667 185727 DEBUG nova.network.os_vif_util [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.668 185727 DEBUG nova.network.os_vif_util [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.668 185727 DEBUG os_vif [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.669 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.670 185727 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d907fd7-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.671 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.673 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.673 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.675 185727 INFO os_vif [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b')
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.675 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[d56a43c7-0d88-4d72-a04d-75a3853c96ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.676 185727 INFO nova.virt.libvirt.driver [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Deleting instance files /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f_del
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.677 185727 INFO nova.virt.libvirt.driver [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Deletion of /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f_del complete
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.676 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[6c66a9ca-f1bb-49ad-93fa-e33de620c93f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.686 206438 DEBUG oslo.privsep.daemon [-] privsep: reply[764dac8a-8a19-4c7c-a12c-6ec6d684d691]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628214, 'reachable_time': 34593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218227, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.689 105762 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:56:51 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:56:51.689 105762 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3c9f78-4320-4e6f-9938-f5b09ef32d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d09e98704\x2dcf1f\x2d47d1\x2d8021\x2d93211c7aa37e.mount: Deactivated successfully.
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.891 185727 INFO nova.compute.manager [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Took 0.50 seconds to destroy the instance on the hypervisor.
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.892 185727 DEBUG oslo.service.loopingcall [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.892 185727 DEBUG nova.compute.manager [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:56:51 compute-0 nova_compute[185723]: 2026-02-16 13:56:51.892 185727 DEBUG nova.network.neutron [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:56:53 compute-0 podman[218230]: 2026-02-16 13:56:53.007825839 +0000 UTC m=+0.048265872 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:56:53 compute-0 podman[218229]: 2026-02-16 13:56:53.038428081 +0000 UTC m=+0.079606112 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, release=1770267347, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:56:55 compute-0 sshd-session[218269]: Invalid user postgres from 146.190.22.227 port 37850
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG nova.compute.manager [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG oslo_concurrency.lockutils [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG oslo_concurrency.lockutils [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG oslo_concurrency.lockutils [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG nova.compute.manager [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:55 compute-0 nova_compute[185723]: 2026-02-16 13:56:55.941 185727 DEBUG nova.compute.manager [req-da5455b8-bed9-4410-8ab6-a7b9157849a5 req-430ca6ee-87c2-4313-b437-365bd92637aa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.244 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.671 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.707 185727 DEBUG nova.network.neutron [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.730 185727 INFO nova.compute.manager [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Took 4.84 seconds to deallocate network for instance.
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.792 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.793 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.800 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.818 185727 DEBUG nova.compute.manager [req-2589458d-ff63-4b87-9d2e-d0903b1afde9 req-4967019e-f956-497d-ba34-d43292f39cf6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-deleted-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.830 185727 INFO nova.scheduler.client.report [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Deleted allocations for instance 3266d7e2-8d63-44ff-970a-45b95f88dc2f
Feb 16 13:56:56 compute-0 sshd-session[218269]: Connection closed by invalid user postgres 146.190.22.227 port 37850 [preauth]
Feb 16 13:56:56 compute-0 nova_compute[185723]: 2026-02-16 13:56:56.905 185727 DEBUG oslo_concurrency.lockutils [None req-ef487dfe-2440-428b-a8e9-b6846ed3f9e0 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.028 185727 DEBUG nova.compute.manager [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.029 185727 DEBUG oslo_concurrency.lockutils [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.029 185727 DEBUG oslo_concurrency.lockutils [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.029 185727 DEBUG oslo_concurrency.lockutils [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.030 185727 DEBUG nova.compute.manager [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:58 compute-0 nova_compute[185723]: 2026-02-16 13:56:58.030 185727 WARNING nova.compute.manager [req-2e0bb0c9-c29e-43f6-a7dd-4a07bbcba8ae req-f07e1573-cf81-42d3-86ff-a48b140bc320 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state deleted and task_state None.
Feb 16 13:56:59 compute-0 podman[218272]: 2026-02-16 13:56:59.027406969 +0000 UTC m=+0.069623873 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:56:59 compute-0 podman[195053]: time="2026-02-16T13:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:56:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:56:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:57:01 compute-0 nova_compute[185723]: 2026-02-16 13:57:01.247 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:01 compute-0 openstack_network_exporter[197909]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:57:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:57:01 compute-0 openstack_network_exporter[197909]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:57:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:57:01 compute-0 nova_compute[185723]: 2026-02-16 13:57:01.672 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:03 compute-0 podman[218298]: 2026-02-16 13:57:03.002091279 +0000 UTC m=+0.042835727 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:57:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:03.253 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:03.253 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:03.254 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:04 compute-0 sshd-session[218322]: Invalid user postgres from 188.166.42.159 port 38834
Feb 16 13:57:04 compute-0 sshd-session[218322]: Connection closed by invalid user postgres 188.166.42.159 port 38834 [preauth]
Feb 16 13:57:06 compute-0 nova_compute[185723]: 2026-02-16 13:57:06.249 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:06 compute-0 nova_compute[185723]: 2026-02-16 13:57:06.649 185727 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771250211.6470938, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:57:06 compute-0 nova_compute[185723]: 2026-02-16 13:57:06.650 185727 INFO nova.compute.manager [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Stopped (Lifecycle Event)
Feb 16 13:57:06 compute-0 nova_compute[185723]: 2026-02-16 13:57:06.709 185727 DEBUG nova.compute.manager [None req-64404e2e-a39e-4385-8dd6-17a482385623 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:57:06 compute-0 nova_compute[185723]: 2026-02-16 13:57:06.709 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:11 compute-0 nova_compute[185723]: 2026-02-16 13:57:11.291 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:11 compute-0 nova_compute[185723]: 2026-02-16 13:57:11.710 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:16 compute-0 nova_compute[185723]: 2026-02-16 13:57:16.293 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:16 compute-0 nova_compute[185723]: 2026-02-16 13:57:16.711 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:19 compute-0 nova_compute[185723]: 2026-02-16 13:57:19.770 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:19 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:19.771 105360 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:57:19 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:19.772 105360 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:57:19 compute-0 nova_compute[185723]: 2026-02-16 13:57:19.939 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:20 compute-0 sshd-session[218324]: Invalid user oracle from 146.190.226.24 port 57106
Feb 16 13:57:20 compute-0 sshd-session[218324]: Connection closed by invalid user oracle 146.190.226.24 port 57106 [preauth]
Feb 16 13:57:21 compute-0 nova_compute[185723]: 2026-02-16 13:57:21.295 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:21 compute-0 nova_compute[185723]: 2026-02-16 13:57:21.712 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:24 compute-0 podman[218326]: 2026-02-16 13:57:24.027444183 +0000 UTC m=+0.058344254 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64)
Feb 16 13:57:24 compute-0 podman[218327]: 2026-02-16 13:57:24.05824398 +0000 UTC m=+0.076263050 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:57:24 compute-0 sshd-session[218367]: Invalid user ubuntu from 64.227.72.94 port 57192
Feb 16 13:57:24 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:57:24.774 105360 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b0e583b2-47d7-4bde-bbd6-282143e0c194, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:57:24 compute-0 sshd-session[218367]: Connection closed by invalid user ubuntu 64.227.72.94 port 57192 [preauth]
Feb 16 13:57:26 compute-0 nova_compute[185723]: 2026-02-16 13:57:26.298 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:26 compute-0 nova_compute[185723]: 2026-02-16 13:57:26.714 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:29 compute-0 podman[195053]: time="2026-02-16T13:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:57:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:57:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Feb 16 13:57:30 compute-0 podman[218369]: 2026-02-16 13:57:30.074073436 +0000 UTC m=+0.103641800 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.300 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:31 compute-0 openstack_network_exporter[197909]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:57:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:57:31 compute-0 openstack_network_exporter[197909]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:57:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.492 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.493 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.493 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.493 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.663 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.666 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.21996307373047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.666 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.667 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.715 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.751 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.752 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.782 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.808 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.834 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:57:31 compute-0 nova_compute[185723]: 2026-02-16 13:57:31.835 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:33 compute-0 nova_compute[185723]: 2026-02-16 13:57:33.835 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:33 compute-0 nova_compute[185723]: 2026-02-16 13:57:33.836 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:34 compute-0 podman[218397]: 2026-02-16 13:57:34.011979399 +0000 UTC m=+0.055153064 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:57:34 compute-0 nova_compute[185723]: 2026-02-16 13:57:34.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:35 compute-0 nova_compute[185723]: 2026-02-16 13:57:35.429 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:36 compute-0 nova_compute[185723]: 2026-02-16 13:57:36.301 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:36 compute-0 nova_compute[185723]: 2026-02-16 13:57:36.432 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:36 compute-0 nova_compute[185723]: 2026-02-16 13:57:36.433 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:57:36 compute-0 nova_compute[185723]: 2026-02-16 13:57:36.717 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:37 compute-0 nova_compute[185723]: 2026-02-16 13:57:37.434 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:37 compute-0 nova_compute[185723]: 2026-02-16 13:57:37.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:57:37 compute-0 nova_compute[185723]: 2026-02-16 13:57:37.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:57:37 compute-0 nova_compute[185723]: 2026-02-16 13:57:37.452 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:57:37 compute-0 nova_compute[185723]: 2026-02-16 13:57:37.452 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:39 compute-0 nova_compute[185723]: 2026-02-16 13:57:39.448 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:41 compute-0 nova_compute[185723]: 2026-02-16 13:57:41.303 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:41 compute-0 nova_compute[185723]: 2026-02-16 13:57:41.718 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:46 compute-0 nova_compute[185723]: 2026-02-16 13:57:46.305 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:46 compute-0 nova_compute[185723]: 2026-02-16 13:57:46.718 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:51 compute-0 nova_compute[185723]: 2026-02-16 13:57:51.307 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:51 compute-0 nova_compute[185723]: 2026-02-16 13:57:51.721 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:55 compute-0 podman[218422]: 2026-02-16 13:57:55.018594706 +0000 UTC m=+0.057939173 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Feb 16 13:57:55 compute-0 podman[218423]: 2026-02-16 13:57:55.02038963 +0000 UTC m=+0.054700272 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent)
Feb 16 13:57:55 compute-0 ovn_controller[96072]: 2026-02-16T13:57:55Z|00273|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Feb 16 13:57:56 compute-0 sshd-session[218459]: Invalid user postgres from 188.166.42.159 port 57084
Feb 16 13:57:56 compute-0 nova_compute[185723]: 2026-02-16 13:57:56.309 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:56 compute-0 sshd-session[218459]: Connection closed by invalid user postgres 188.166.42.159 port 57084 [preauth]
Feb 16 13:57:56 compute-0 nova_compute[185723]: 2026-02-16 13:57:56.722 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:59 compute-0 podman[195053]: time="2026-02-16T13:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:57:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:57:59 compute-0 podman[195053]: @ - - [16/Feb/2026:13:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Feb 16 13:58:01 compute-0 podman[218461]: 2026-02-16 13:58:01.137571782 +0000 UTC m=+0.164767782 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:58:01 compute-0 nova_compute[185723]: 2026-02-16 13:58:01.315 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:01 compute-0 openstack_network_exporter[197909]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:58:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:58:01 compute-0 openstack_network_exporter[197909]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:58:01 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:58:01 compute-0 nova_compute[185723]: 2026-02-16 13:58:01.724 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:58:03.255 105360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:58:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:58:03.256 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:58:03 compute-0 ovn_metadata_agent[105355]: 2026-02-16 13:58:03.256 105360 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:58:05 compute-0 podman[218487]: 2026-02-16 13:58:05.03225079 +0000 UTC m=+0.066487666 container health_status 4ec6105d1727f201f656d7e6e358931be8ceabc5af37fb5ad779abc620122180 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:58:06 compute-0 nova_compute[185723]: 2026-02-16 13:58:06.318 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:06 compute-0 nova_compute[185723]: 2026-02-16 13:58:06.726 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:11 compute-0 nova_compute[185723]: 2026-02-16 13:58:11.321 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:11 compute-0 sshd-session[218512]: Invalid user ubuntu from 64.227.72.94 port 33842
Feb 16 13:58:11 compute-0 sshd-session[218512]: Connection closed by invalid user ubuntu 64.227.72.94 port 33842 [preauth]
Feb 16 13:58:11 compute-0 nova_compute[185723]: 2026-02-16 13:58:11.727 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:16 compute-0 nova_compute[185723]: 2026-02-16 13:58:16.324 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:16 compute-0 nova_compute[185723]: 2026-02-16 13:58:16.729 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:21 compute-0 nova_compute[185723]: 2026-02-16 13:58:21.326 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:21 compute-0 nova_compute[185723]: 2026-02-16 13:58:21.731 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:22 compute-0 sshd-session[218516]: Accepted publickey for zuul from 192.168.122.10 port 33588 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:58:22 compute-0 systemd-logind[818]: New session 35 of user zuul.
Feb 16 13:58:22 compute-0 systemd[1]: Started Session 35 of User zuul.
Feb 16 13:58:22 compute-0 sshd-session[218516]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:58:22 compute-0 sudo[218520]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 16 13:58:22 compute-0 sudo[218520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:58:23 compute-0 sshd-session[218514]: Invalid user postgres from 146.190.22.227 port 57310
Feb 16 13:58:23 compute-0 sshd-session[218514]: Connection closed by invalid user postgres 146.190.22.227 port 57310 [preauth]
Feb 16 13:58:24 compute-0 nova_compute[185723]: 2026-02-16 13:58:24.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:24 compute-0 nova_compute[185723]: 2026-02-16 13:58:24.434 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:58:24 compute-0 nova_compute[185723]: 2026-02-16 13:58:24.456 185727 DEBUG nova.compute.manager [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:58:26 compute-0 podman[218662]: 2026-02-16 13:58:26.032120678 +0000 UTC m=+0.056191179 container health_status d69bd0be854ec8043fcba555b70c2cd311c7a2510d07d6bc9929fe7684abece9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:58:26 compute-0 podman[218661]: 2026-02-16 13:58:26.040386844 +0000 UTC m=+0.064380493 container health_status 93151dcbec9f159583042df5101bdd7b5b1bcb5f3117955b4bc9cd1c0e465b98 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, config_id=openstack_network_exporter, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:58:26 compute-0 nova_compute[185723]: 2026-02-16 13:58:26.332 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:26 compute-0 nova_compute[185723]: 2026-02-16 13:58:26.732 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:26 compute-0 ovs-vsctl[218727]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 16 13:58:27 compute-0 sshd-session[218735]: Invalid user oracle from 146.190.226.24 port 60818
Feb 16 13:58:27 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 218544 (sos)
Feb 16 13:58:27 compute-0 sshd-session[218735]: Connection closed by invalid user oracle 146.190.226.24 port 60818 [preauth]
Feb 16 13:58:27 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 16 13:58:27 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 16 13:58:27 compute-0 virtqemud[184843]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 16 13:58:27 compute-0 virtqemud[184843]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 16 13:58:27 compute-0 virtqemud[184843]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 16 13:58:28 compute-0 crontab[219135]: (root) LIST (root)
Feb 16 13:58:29 compute-0 podman[195053]: time="2026-02-16T13:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:58:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 13:58:29 compute-0 podman[195053]: @ - - [16/Feb/2026:13:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 13:58:29 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 16 13:58:30 compute-0 systemd[1]: Starting Hostname Service...
Feb 16 13:58:30 compute-0 systemd[1]: Started Hostname Service.
Feb 16 13:58:31 compute-0 nova_compute[185723]: 2026-02-16 13:58:31.339 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:31 compute-0 openstack_network_exporter[197909]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:58:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:58:31 compute-0 openstack_network_exporter[197909]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:58:31 compute-0 openstack_network_exporter[197909]: 
Feb 16 13:58:31 compute-0 nova_compute[185723]: 2026-02-16 13:58:31.456 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:31 compute-0 nova_compute[185723]: 2026-02-16 13:58:31.734 185727 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:32 compute-0 podman[219438]: 2026-02-16 13:58:32.096967766 +0000 UTC m=+0.132998801 container health_status 19961a2f872cc72daf954f4dd556f980b5f58f137d145776df6132c167a8a93c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc-8f6ae2e1a687e6bdd04c014e1955e37194733419590965d86dbf32bcf1194ecc'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.433 185727 DEBUG oslo_service.periodic_task [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.505 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.506 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.506 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.506 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.672 185727 WARNING nova.virt.libvirt.driver [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.674 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5335MB free_disk=72.99961853027344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.674 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.674 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.995 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:58:32 compute-0 nova_compute[185723]: 2026-02-16 13:58:32.996 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:58:33 compute-0 nova_compute[185723]: 2026-02-16 13:58:33.090 185727 DEBUG nova.compute.provider_tree [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed in ProviderTree for provider: c9501a85-df32-4b8f-bce0-9425ef1e7866 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:58:33 compute-0 nova_compute[185723]: 2026-02-16 13:58:33.112 185727 DEBUG nova.scheduler.client.report [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Inventory has not changed for provider c9501a85-df32-4b8f-bce0-9425ef1e7866 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:58:33 compute-0 nova_compute[185723]: 2026-02-16 13:58:33.143 185727 DEBUG nova.compute.resource_tracker [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:58:33 compute-0 nova_compute[185723]: 2026-02-16 13:58:33.144 185727 DEBUG oslo_concurrency.lockutils [None req-e83b7bbd-12f3-47de-b889-b26fe1c47479 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
